== Diagram: ==
[[File:DiagramGroupLab.jpg]]
== Implementation: ==
'''Processing:'''
Acting as a gateway for the control data, Processing on Kan's laptop was used to create a test visualization that reacts to the color, position, amount, and rotation-speed information sent from the controls in TouchOsc. More importantly, it plays the key role of forwarding these messages simultaneously to the Arduino on Kan's side (as OSC messages over the serial port) and to Maxmsp on Di's laptop (as OSC messages over UDP).
[[File:P51st.jpg]]
[[File:P52nd.jpg]]
[[File:P5Graphics.jpg]]
[[File:P5Visualization.jpg]]
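The forwarding step above can be sketched in plain Java (the language underneath Processing). This is an illustrative reconstruction, not the group's actual sketch: it encodes an OSC message by hand, following the OSC 1.0 binary format that Processing OSC libraries implement internally, and sends it over UDP to port 12345. The address pattern <code>/color</code> and the argument values are assumptions for the example.

```java
import java.io.ByteArrayOutputStream;
import java.net.DatagramPacket;
import java.net.DatagramSocket;
import java.net.InetAddress;
import java.nio.ByteBuffer;

// Minimal sketch of the forwarding idea (not the group's actual code):
// encode an OSC message ("/color" with three ints) per the OSC 1.0 binary
// format and send it over UDP. Address pattern and values are illustrative.
public class OscForwarder {

    // Pad a string to a multiple of 4 bytes with NULs, as OSC requires.
    static byte[] oscString(String s) {
        int len = s.length() + 1;           // include the terminating NUL
        int padded = (len + 3) / 4 * 4;     // round up to a multiple of 4
        byte[] out = new byte[padded];
        System.arraycopy(s.getBytes(), 0, out, 0, s.length());
        return out;
    }

    // Build an OSC message: address, type-tag string, then big-endian int args.
    public static byte[] encode(String address, int... args) {
        ByteArrayOutputStream buf = new ByteArrayOutputStream();
        try {
            buf.write(oscString(address));
            StringBuilder tags = new StringBuilder(",");
            for (int ignored : args) tags.append('i');
            buf.write(oscString(tags.toString()));
            for (int a : args) buf.write(ByteBuffer.allocate(4).putInt(a).array());
        } catch (java.io.IOException e) {
            throw new RuntimeException(e);   // cannot happen with a byte buffer
        }
        return buf.toByteArray();
    }

    public static void main(String[] cliArgs) throws Exception {
        byte[] packet = encode("/color", 255, 0, 127);
        try (DatagramSocket socket = new DatagramSocket()) {
            socket.send(new DatagramPacket(packet, packet.length,
                    InetAddress.getLoopbackAddress(), 12345));
        }
        System.out.println("sent " + packet.length + " bytes");
    }
}
```

The same encoded bytes work for both transports: over UDP they go out as one datagram, while over the serial link to the Arduino they would be framed by the serial protocol.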
'''Arduino:'''
A servo, a buzzer, and an RGB LED were used as the physical outputs, reading the TouchOsc control data relayed by Processing. Driven by the RGB rotary and speed fader variables, the color of the RGB LED and the sounds of the servo (generated by its rotation speed) and the buzzer react synchronously.
[[File:Arduino1st.jpg]]
[[File:ArduinoReact.jpg]]
'''MaxMsp:'''
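The mapping described above, from normalized TouchOsc control values to an LED color and a servo speed, might look like the following sketch. All names, ranges, and the hue-wheel color mapping are assumptions for illustration; the group's actual Arduino code is not reproduced here.

```java
// Hypothetical model of the control mapping: TouchOsc faders and rotaries
// send normalized floats (0.0–1.0), which must become an RGB triple (0–255)
// and a servo rotation speed. Ranges and names are illustrative assumptions.
public class ControlMap {

    // Map a 0..1 fader value to a servo speed in degrees per second.
    public static int faderToServoSpeed(float fader, int maxDegPerSec) {
        return Math.round(clamp(fader) * maxDegPerSec);
    }

    // Map a 0..1 rotary value around the hue wheel to an RGB triple (0..255).
    public static int[] rotaryToRgb(float rotary) {
        float h = clamp(rotary) * 6f;        // hue position in sectors 0..6
        int sector = Math.min((int) h, 5);
        int frac = Math.round((h - sector) * 255);
        switch (sector) {
            case 0:  return new int[]{255, frac, 0};        // red -> yellow
            case 1:  return new int[]{255 - frac, 255, 0};  // yellow -> green
            case 2:  return new int[]{0, 255, frac};        // green -> cyan
            case 3:  return new int[]{0, 255 - frac, 255};  // cyan -> blue
            case 4:  return new int[]{frac, 0, 255};        // blue -> magenta
            default: return new int[]{255, 0, 255 - frac};  // magenta -> red
        }
    }

    static float clamp(float v) { return Math.max(0f, Math.min(1f, v)); }
}
```

Because one rotary value drives both the LED color and the servo sound, the two outputs stay in sync by construction, which is the synchronous reaction the paragraph describes.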
To create a striking visualization, shader and noise rendering were used in the Maxmsp patch, which was planned as our main visualization, as in a VJ performance; what makes it interesting is that it can be controlled by the RGB rotary and position controls in TouchOsc smoothly, yet with surprising results. The udpreceive object is the key to listening for the messages sent from Processing; 12345 is the remote port, which must be set identically in both Processing and Maxmsp. Using the OSC protocol implementation from [http://cnmat.berkeley.edu/ CNMAT], the communication between Processing and Maxmsp works.
[[File:maxmsp1.jpg]]
[[File:maxmsp.jpg]]
[[File:MaxmspPatchGroup.jpg]]
'''3. Output'''
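The role of the udpreceive object can be approximated in Java: bind the agreed port (12345) and read the OSC address pattern, which is the NUL-terminated string at the start of every packet. This is a hedged sketch of the receiving side, not the actual patch logic; the main method sends itself a test packet so it runs standalone.

```java
import java.net.DatagramPacket;
import java.net.DatagramSocket;
import java.net.InetAddress;

// Rough Java analogue of what [udpreceive 12345] does in the Max patch:
// bind the agreed port and read the OSC address pattern from each datagram.
// Illustrative sketch only; the real receiver is the Maxmsp patch itself.
public class OscListener {

    // Extract the OSC address pattern (leading NUL-terminated string).
    public static String address(byte[] packet) {
        int end = 0;
        while (end < packet.length && packet[end] != 0) end++;
        return new String(packet, 0, end);
    }

    public static void main(String[] args) throws Exception {
        try (DatagramSocket receiver = new DatagramSocket(12345);
             DatagramSocket sender = new DatagramSocket()) {
            // Self-test: send one packet to the bound port, then receive it.
            byte[] msg = {'/', 'c', 'o', 'l', 'o', 'r', 0, 0};
            sender.send(new DatagramPacket(msg, msg.length,
                    InetAddress.getLoopbackAddress(), 12345));
            byte[] buf = new byte[1024];
            DatagramPacket packet = new DatagramPacket(buf, buf.length);
            receiver.receive(packet);   // in the patch, this waits for Processing
            System.out.println("received " + address(packet.getData()));
        }
    }
}
```

Matching the port number on both ends is the whole trick: Processing's sender and the patch's udpreceive must agree on 12345, exactly as the paragraph notes.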
Our group's rhythm was built on the visualization and audio described above. Another Maxmsp patch was programmed to collect the sound from Di's laptop, Kan's physical outputs, and the surrounding environment, and to play it in a loop; simply pressing the space bar on Shuyan's laptop creates and adds new rhythms to the melody. Inside this patch, an ASCII code (32, which represents the space bar in our case) defines the key that must be pressed to record new sound.
With the visualization and audio cooperating across both laptops and the physical outputs, a single mobile phone, iPad, or any other mobile device is enough to create and play your own rhythmic melody. You get a user experience spanning the interface, the visualization, and the audio generated by the physical outputs. Enjoy it!
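The record-on-space-bar behavior can be modeled as simple state logic keyed on ASCII code 32. The class below is a hypothetical stand-in for the Maxmsp patch, written for illustration only; the layer names are invented, and the real patch records audio rather than strings.

```java
import java.util.ArrayList;
import java.util.List;

// Simplified model of the record-on-space-bar idea: key code 32 (the space
// bar) toggles recording, and each finished recording becomes a new loop
// layer. A stand-in sketch; the real version is a Maxmsp patch, not Java.
public class RhythmLooper {
    private final List<String> layers = new ArrayList<>();
    private boolean recording = false;

    // Called for every key press; only ASCII 32 affects the looper.
    public void keyPressed(int keyCode) {
        if (keyCode == 32) {
            if (recording) layers.add("layer-" + (layers.size() + 1));
            recording = !recording;          // start or stop recording
        }
    }

    public int layerCount() { return layers.size(); }
    public boolean isRecording() { return recording; }
}
```

Filtering on the key code, rather than on a character, is what lets the patch react to the space bar regardless of keyboard layout or modifier state.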
[[File:MaxmspShuyan.jpg]]
[[File:OutputGroup.jpg]]
'''Video Link'''
https://www.youtube.com/watch?v=enNGtWU-nss&feature=youtu.be