Collaboration Idea:
Involves:
- Body movement
- Singer
- Electronic music
- Real-time control of 3D graphics via audio
- Various wireless sensors attached to the dancer's clothing, hands, and shoes
- Wireless microcontroller
- Custom software developed for manipulating objects in OpenGL
People: Elisabet Curbelo, Colin Zyskowski … anyone else who cares to be involved
Product: A singer will be on stage in front of a screen. The singer will wear force sensors and piezoelectric sensors on her shoes, proximity sensors on her hands, and light sensors on her torso. All of these sensors will be connected to a microcontroller with a wireless interface, and a hub will receive the wireless data from the singer's sensors. The data will then be used to create or alter sounds: for example, a step could trigger an electronic sound, while the proximity of the hands alters the amount of delay added to the singer's voice. Simultaneously, the sound that is created will trigger events in the 3D graphics software presented on the screen behind the singer. The volume of a particular sound could be linked to the coordinates of one or more 3D objects, for example, or the singer's position on stage could map directly to the position of the OpenGL camera in 3D space.
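The hub's core job described above — turning raw sensor packets into sound and graphics parameters — could be sketched as follows. This is only an illustration: the function names, the 10-bit sensor range, the 700-count step threshold, and the 0–500 ms delay mapping are all assumptions, not part of the actual custom software.

```python
# Hypothetical sketch of the hub's mapping stage: one packet of raw
# sensor readings is normalized and mapped to sound and graphics
# control parameters. All names, ranges, and thresholds are assumed.

def normalize(raw, lo=0, hi=1023):
    """Clamp a raw sensor reading (assumed 10-bit ADC) into 0.0-1.0."""
    return max(0.0, min(1.0, (raw - lo) / float(hi - lo)))

def map_sensors(readings):
    """Map one packet of readings to control parameters.

    readings: dict of raw values, e.g.
      {"force": 812, "proximity": 143, "light": 560}
    """
    params = {}
    # A firm step (force reading above an assumed threshold)
    # triggers an electronic sound.
    params["trigger_step"] = readings["force"] > 700
    # Hand proximity scales the delay added to the singer's voice,
    # here mapped onto an assumed 0-500 ms delay time.
    params["delay_ms"] = normalize(readings["proximity"]) * 500.0
    # The torso light level could drive one coordinate of the
    # OpenGL camera, for example its height.
    params["camera_y"] = normalize(readings["light"]) * 10.0
    return params

if __name__ == "__main__":
    print(map_sensors({"force": 812, "proximity": 143, "light": 560}))
```

In a real setup the resulting parameters would be forwarded to the audio engine and the OpenGL software, e.g. as OSC or MIDI messages, so that both react to the same packet in real time.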
The purpose of this project is to link, as closely as possible and in real time, movement, sound, and light.
Previous music by the composer: http://www.elisabetcurbelo.com/elisabetcurbelo/kara_toprak_en.html http://www.elisabetcurbelo.com/elisabetcurbelo/Mikrop_en.html