GMU:Digital Puppetry Lab/Group Play with rhythms

Group work By:

- Kan Feng: Processing visualization, Arduino
- Di Yang: Max/MSP visualization
- Shuyan Chen: Max/MSP sound
- JiXiang Jiang: Documentation

Introduction:

In our group, we want graphics and physical objects to act as outputs that give synchronized feedback: visualization and sound react to the same controls, operated from a mobile device. The result is a performance that combines visualization with sound, where the rhythms come from the laptop's sound card, from physical objects driven on the Arduino side, and from recordings of environmental sound.

We use TouchOSC as the input, sending control data to Processing as OSC messages. Once Processing has received and analyzed the data, it forwards it to Max/MSP on the other computers, which control the visualization and sound, and to the Arduino side, which drives different physical objects that produce different sounds. In this way the same controls on our mobile phones set the same parameters everywhere, so the visual and audio interaction happens both graphically and physically.
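A minimal sketch of this relay in Processing, using the oscP5 and netP5 libraries. The port numbers, the fader address /1/fader1, the Max/MSP host address, and the choice of the first serial port are placeholder assumptions for illustration, not our actual setup:

import oscP5.*;
import netP5.*;
import processing.serial.*;

OscP5 oscP5;            // listens for TouchOSC messages
NetAddress maxMspHost;  // computer running the Max/MSP patch
Serial arduino;         // serial link to the Arduino

void setup() {
  oscP5 = new OscP5(this, 8000);                      // TouchOSC outgoing port (assumed)
  maxMspHost = new NetAddress("192.168.0.12", 7400);  // Max/MSP machine and udpreceive port (assumed)
  arduino = new Serial(this, Serial.list()[0], 9600); // first serial device (assumed)
}

void oscEvent(OscMessage msg) {
  // Forward every incoming control message unchanged to Max/MSP
  oscP5.send(msg, maxMspHost);

  // For a fader, also send a scaled byte to the Arduino
  if (msg.checkAddrPattern("/1/fader1")) {
    float value = msg.get(0).floatValue();            // TouchOSC faders send 0.0-1.0
    arduino.write(int(value * 255));                  // scale to one byte for the Arduino
  }
}

void draw() {
  // visualization code reacting to the same values would go here
}

On the receiving side, a [udpreceive 7400] object in Max/MSP could pick up the forwarded messages, and on the Arduino, Serial.read() could return the 0-255 value to drive whatever physical sound-maker is attached.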