How does it feel to interact naturally with sound, without touching anything physical?
I decided to experiment with the topic for NYC Media Lab's Future Interfaces, held at Razorfish NYC last Tuesday. Natural User Interfaces (NUIs) have been getting a shot in the arm recently with the Kinect V2, and the immediacy of the interaction is becoming really clear.
For me, this style of interaction lends itself to an 'escape from the computer': the idea that what you are really doing is interacting with the environment, with the sound, vision and space - not a computer. I therefore focused entirely on the immersive qualities of the experience, so that the 'control' doesn't really feel like control at all.
In this installation I focused solely on the hands, tracking their vertical and horizontal position relative to the sensor. These values are fed into a synthesizer and effects chain, allowing intuitive interaction with continuously generated sound.
People took to the experiment pretty easily - and that's the stage it is at, pure experiment. It builds on the ofxKinectV2-OSC addon I created recently, and on Adam Carlucci's excellent work with openFrameworks and Audio Units.