GMU:Human and Nonhuman Performances II SS16/Di Yang

 
<br style="clear:both">


This was my original idea for the sound feedback: an individual buzzer for each single-color object. A melody or sound would play whenever the distance between any two objects became close or far enough. But while testing the synchronisation of the tracking data with the Arduino, I found the buzzers didn't behave the way I wanted: a melody sent to one specific pin played through all three buzzers. I couldn't solve this, so I had to look for another solution. The simplest one was to play the audio through the Processing sketch from the computer speakers, which is not as intuitive and natural as the original plan. Still, it clearly communicates the meaning when the objects get close or part, with corresponding sound effects (close: heartbeat / far: heartbeat fading away).
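For reference, a minimal sketch of how this distance-based playback could look in Processing with the Minim library. The object positions (objA, objB), the sample name "heartbeat.mp3" and the distance threshold are placeholders for illustration, not the exact code used in the project; the positions would come from the colour-tracking part of the sketch.

<syntaxhighlight lang="java">
import ddf.minim.*;

Minim minim;
AudioPlayer heartbeat;

// Assumed: these positions are updated elsewhere by the colour tracking
PVector objA = new PVector();
PVector objB = new PVector();

float closeThreshold = 100;  // assumed distance in pixels that counts as "close"

void setup() {
  size(640, 480);
  minim = new Minim(this);
  // Assumed sample in the sketch's data folder
  heartbeat = minim.loadFile("heartbeat.mp3");
}

void draw() {
  float d = PVector.dist(objA, objB);

  if (d < closeThreshold) {
    // Objects are close: loop the heartbeat if it is not already playing
    if (!heartbeat.isPlaying()) {
      heartbeat.loop();
    }
  } else {
    // Objects are apart: stop the heartbeat (a gain fade could be added here
    // to get the "fading away" effect)
    if (heartbeat.isPlaying()) {
      heartbeat.pause();
      heartbeat.rewind();
    }
  }
}
</syntaxhighlight>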