The most recent project I'm happy to be part of is Musiklusion. Musiklusion was brought to life by Andreas Brandt. The aim of the project is to enable people with both mental and physical disabilities to create music. To that end, we make use of all kinds of input technologies, such as Arduinos or the HTC Vive.
I'm happy to announce that I, as part of the Super Nubibus team, was allowed to exhibit at the ZKM | Zentrum für Kunst und Medien in Karlsruhe, Germany. We developed a virtual balloon ride over 19th-century Karlsruhe, in which we make use of several interactive elements to enhance the immersiveness of the experience.
As part of an art and research group, I was lucky enough to exhibit our installation "Ideal Spaces" at the Venice Architectural Biennale 2016. The exhibition is a compilation of three parts: a cave-like projection, an interactive table, and a world disk, which together portray a montage of the history of planned spaces. If you would like to read about each world in further detail, or for more background information on how the exhibition was planned and produced, please visit idealspaces.org.
Since 2014, we have been developing an app to bring city communities back together. Engage with your city, your neighbourhood, your campus! Become part of your local community and stay connected with your city or Zone. Depending on your location, you'll find cities, neighbourhoods, campuses and many other Zones. Choose a Zone from the menu and communicate with your local community. Every Zone comes with its own newsfeed, showing you what others in your area are excited about or discussing. Join the conversation and share your own text and pictures; your posts will appear in the newsfeed. It's all about the places you're in. Organise events, get in touch with others, or share and receive updates on what's happening around you. Share what you like sharing!
Technologies like Processing, the Microsoft Kinect and Max/MSP offer mind-blowing possibilities, allowing you to build your very own interactive haptic table in just one (short) semester. The idea behind the Haptic Sound Machine was to create a fully interactive interface for sound. Depending on the position and strength of your touch on the fabric surface, you can manipulate the speed and the echo of the sound in real time.
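To give a feel for the idea, here is a minimal sketch of how a touch on the surface could be mapped to the two sound parameters. The function name, ranges and mapping curve are my own illustrative assumptions, not the actual Max/MSP patch used in the project:

```python
def touch_to_params(x, pressure, width=1.0, max_pressure=1.0):
    """Map a touch on the fabric surface to sound parameters.

    x: horizontal touch position (0..width)
    pressure: touch strength (0..max_pressure)
    Returns (playback_speed, echo_mix).
    """
    # Clamp inputs to their expected ranges.
    x_norm = min(max(x / width, 0.0), 1.0)
    p_norm = min(max(pressure / max_pressure, 0.0), 1.0)
    # Horizontal position scales playback speed between 0.5x and 2.0x.
    speed = 0.5 + 1.5 * x_norm
    # Pressure controls how much of the delayed (echo) signal is mixed in.
    echo_mix = p_norm
    return speed, echo_mix
```

In the actual installation these values would be sent on to the audio engine (e.g. as control messages to a Max/MSP patch) on every touch update.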
At the end of 2015, I was invited to a leisurely get-together at a friend's house. During our evening together we wanted to play a Nintendo NES-style competition on the living room TV screen using our individual mobile phones. We couldn't find a single game that matched our requirements, so we created our own.
Our world is in constant motion. Information floods our brains from dawn till dusk. We wanted to draw attention to some of the information people tend to fade out on a day-to-day basis but that should not be forgotten. That is why we developed Since: an app that offers you personalized information depending on the time elapsed since you installed it.
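The core mechanic can be sketched in a few lines. The milestone table and messages below are hypothetical placeholders, purely to show how elapsed install time could select what the user sees:

```python
from datetime import datetime, timezone

# Hypothetical milestone table: each entry pairs a minimum number of days
# since installation with the message unlocked at that point, sorted from
# the largest threshold down.
MILESTONES = [
    (30, "message shown after a month"),
    (7, "message shown after a week"),
    (1, "message shown after a day"),
    (0, "welcome message"),
]

def message_for(install_time, now=None):
    """Pick the message matching the time elapsed since installation."""
    now = now or datetime.now(timezone.utc)
    days = (now - install_time).days
    # Because MILESTONES is sorted descending, the first threshold the
    # elapsed time reaches is the most specific match.
    for threshold, message in MILESTONES:
        if days >= threshold:
            return message
    return MILESTONES[-1][1]
```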
When happy little coincidences come your way, sometimes you have to follow them. It is fun to play VR games the first few times, but it is way more fun to build a game the way you like it. When we got a brand-new HTC Vive in our Smart Space Lab, we decided to use the opportunity to experiment in unknown territory. We wanted to investigate how hard or easy it is to build your own 3D game within a virtual space.
Since I was pretty sure about the main topic of my master's thesis, haptics, I wanted to push beyond the limits of the HTC Vive's VR controllers. I created a little lab in which you can play several instruments, like a xylophone and a drum set. Since it's just an early test environment, you can also manipulate the controllers' haptic feedback settings from within the scene in Unity3D.
With the rise of deep learning algorithms and artificial intelligence, a lot of free APIs have appeared over the last few years with which you can run fast and awesome experiments. Especially last year, when IBM's Watson API was free to use, we built a small Facebook app that was able to determine your personality type based on what you had written in your messages. We also used the theory behind Watson, the Big Five model, to find out whether one's self-perception matches other people's perception of one's Instagram profile.
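As a rough illustration of the underlying idea (not Watson's actual, far richer method), text can be scored against word lists associated with Big Five traits. The tiny lexicon below is entirely made up for demonstration purposes:

```python
from collections import Counter

# Toy lexicon for illustration only: a handful of invented word-to-trait
# associations. A real service would use large trained language models.
TRAIT_WORDS = {
    "openness":      {"imagine", "art", "curious"},
    "extraversion":  {"party", "friends", "talk"},
    "agreeableness": {"thanks", "love", "help"},
}

def trait_scores(text):
    """Count how often words linked to each Big Five trait appear."""
    words = Counter(text.lower().split())
    return {
        trait: sum(words[w] for w in vocab)
        for trait, vocab in TRAIT_WORDS.items()
    }
```

Comparing the score profile computed from someone's own messages with the one computed from reactions to their Instagram posts would give a crude version of the self-perception vs. outside-perception comparison described above.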
In general, I am not a huge fan of the interactive videos marketers come up with every once in a while. In my opinion, an interactive video prevents you from getting into that feeling of "flow" most of us know from really good movies, and it lacks the interaction known from immersive computer games. So it's basically a mixture of two things, similar to whisky and cola: I like a good whisky sometimes and I like an ice-cold cola sometimes, but blended together, it just tastes wrong (at least imho). With this in mind, I was very happy when our lecturer (for interactive video) came up with an idea: manipulating video in real time, so that you as a spectator can give it a completely different meaning. After some interesting discussions, we came up with several conditions:
The core idea of this installation was to subconsciously confront users with the fact that their daily behaviour can have a tremendous effect on the environment, especially regarding animals. Users should become sensitized to the unnoticed consequences of their daily lives. Our aim was to draw attention to the simple and easy things you could change in your daily behaviour to contribute to a better environment. It was important to us not only to point out the mistakes everybody makes from day to day, but also to point out easy solutions that require minimal, or even no real, effort.