The topic of the WS 22/23 Lab was to create an interactive multi-user installation.

But it came with a twist: to make the outcome more interesting, each installation had to implement some kind of knowledge dissonance in order to foster cooperation between participants.

As usual, every student had to pitch an idea of their own at the beginning of the semester, and groups were formed based on these pitches.

One group's idea was to create user interactions that, by themselves, limit the information each participant receives. They set up a classic 2D platformer in which the virtual character was controlled by blinking (eye-blink tracking via mocap4face1) and blowing (an anemometer read by an Arduino2). This limited the first participant's sight whenever the character had to jump, while the second participant could not use their voice because they were busy blowing into the anemometer. To change direction, the two participants had to touch each other's hand (also sensed via the Arduino).

The idea behind project 2 was to have two people guide a ship through a canyon: the person on the lookout was placed inside an immersive VR environment, while the helmsman / navigator stood in front of the 180° projection wall.

Regarding the dissonance in information, the person on the lookout saw obstacles the helmsman could not see, so the two had to communicate with each other in order to guide the ship through the canyon.


  1. mocap4face was a tool for extracting blendshape changes from a simple RGB camera. The project has since been discontinued. ↩︎

  2. For the tutorial see here ↩︎