[[File:AA4.jpg]]
<br style="clear:both">
Additionally, when a participant stops making connections with others for too long, their character in the virtual scene will slowly fade away… as mentioned above, human existence becomes meaningless without connection and communication with others.
In a word, by letting the participants play a role in the interactive scene, the concept strengthens the necessity and significance of connection and communication with others.
== Technical Setup ==
[[File:AA5.png|thumb|left|200px|Projected rainfall]]
My original idea of using Blender with Unity or the Unreal engine to create a VR environment turned out to be too complicated: it is hard to set up with simple VR kits like Google Cardboard and to let multiple participants interact at the same time.
So I chose an easier approach for my project, while still keeping the role-play mode.
Starting from the 3D sketch, I developed the concept of a plain and simple way to show the rain around the participants by using a projector.
[[File:ddd12.png|600px|Diagram]]
----
[[File:AA6.png|thumb|left|200px|]]
[[File:AA7.png|thumb|left|200px|]]
Just like the pictures on the left, I will use a tracking system to locate the participants and project dots of rainfall onto the ground where they are actually standing. The circles and dots will change size according to the participants' positions. The patterns indicate the 'inner feeling' of each participant. (The feeling is not their real feeling, but a simulated one in this specific environment.) At the same time, the screen behind will display a corresponding curve, like an electrocardiogram, for each participant in a different color, each with its own rhythm of sound. By looking at the curves on the screen and listening to the sounds, the participants can get a notion of the others' 'feeling' in this interactive 'game'. When people get close, their rhythms become alike and the dots merge.
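The merging behavior described above boils down to one rule: measure the distance between two tracked positions, and below a threshold treat the two dots as one. A minimal sketch of that rule in plain Java; the class name and threshold value are illustrative, not taken from the actual sketch:

```java
// Minimal sketch of the distance-based merging rule described above.
// Positions come from the tracking system; the threshold is illustrative.
public class RainDots {
    // Euclidean distance between two tracked participants (in pixels).
    static double distance(double x1, double y1, double x2, double y2) {
        double dx = x2 - x1, dy = y2 - y1;
        return Math.sqrt(dx * dx + dy * dy);
    }

    // Two dots merge into one when the participants are close enough.
    static boolean shouldMerge(double x1, double y1,
                               double x2, double y2, double threshold) {
        return distance(x1, y1, x2, y2) < threshold;
    }

    public static void main(String[] args) {
        // Participants at (100,100) and (140,130): distance is 50.
        System.out.println(distance(100, 100, 140, 130));        // 50.0
        System.out.println(shouldMerge(100, 100, 140, 130, 60)); // true
        System.out.println(shouldMerge(100, 100, 140, 130, 40)); // false
    }
}
```

The same distance value can also drive how alike the two sound rhythms are, so merging and rhythm synchronization share one measurement.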
'''Updates:'''
Instead of using the tracking system, I am using a color-detection sketch in Processing to capture the movement of certain objects. Ideally, it should be able to track several single-color objects, but in the real world the webcam is just not accurate enough for the tracking, and it depends heavily on the lighting conditions.
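Color detection of this kind works by comparing each webcam pixel against a target color and picking the best match. A plain-Java sketch of that core comparison; the names, tolerance, and packed-pixel format are my illustration, not the actual Processing sketch:

```java
// Sketch of the core of color detection: find the pixel whose RGB value
// is closest to a target color. Structure and names are illustrative;
// the real sketch scans the webcam's pixel array each frame.
public class ColorDetect {
    // Squared RGB distance between two packed 0xRRGGBB colors.
    static int colorDistSq(int c1, int c2) {
        int r1 = (c1 >> 16) & 0xFF, g1 = (c1 >> 8) & 0xFF, b1 = c1 & 0xFF;
        int r2 = (c2 >> 16) & 0xFF, g2 = (c2 >> 8) & 0xFF, b2 = c2 & 0xFF;
        int dr = r1 - r2, dg = g1 - g2, db = b1 - b2;
        return dr * dr + dg * dg + db * db;
    }

    // Index of the pixel closest to the target color, or -1 if none is
    // within the tolerance (lighting strongly affects this in practice).
    static int findTarget(int[] pixels, int target, int toleranceSq) {
        int best = -1, bestDist = toleranceSq;
        for (int i = 0; i < pixels.length; i++) {
            int d = colorDistSq(pixels[i], target);
            if (d < bestDist) { bestDist = d; best = i; }
        }
        return best;
    }

    public static void main(String[] args) {
        int[] frame = {0x102030, 0xFF0000, 0xEE1010, 0x00FF00};
        // Looking for pure red: pixel 1 matches exactly.
        System.out.println(findTarget(frame, 0xFF0000, 50 * 50)); // 1
    }
}
```

The tolerance is exactly where the lighting sensitivity mentioned above comes in: under different light, the same object's pixels drift away from the target color and fall outside the tolerance.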
'''Object tracking test'''
[https://drive.google.com/open?id=0BwFYVbmk3aHYY0tKMmlKZ3ZHeXc]
Sketch based on James Alliban's work: [https://jamesalliban.wordpress.com/2008/11/16/colour-detection-in-processing/]
----
'''Ground graphics:'''
Just as I mentioned above, the field is a coordinate system, and the dots change according to the location data.
[[File:pp1.png]]
[[File:pp2.png]]
[[File:pp3.png]]
[[File:pp4.png]]
[[File:pp5.png]]
[[File:pp6.png]]
The particles stop moving when two participants get close enough, as the fifth picture shows.
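The freezing rule can be stated simply: each particle advances by its velocity every frame, unless the two participants are within a proximity threshold, in which case it stays put. A minimal illustration in plain Java, with positions, velocities, and the threshold all chosen by me for the example:

```java
// Sketch of the 'particles freeze on proximity' rule from the fifth
// picture. Particle representation and threshold are illustrative.
public class Particles {
    // True when two participants are within the proximity threshold.
    static boolean close(double[] a, double[] b, double threshold) {
        double dx = a[0] - b[0], dy = a[1] - b[1];
        return Math.sqrt(dx * dx + dy * dy) < threshold;
    }

    // Advance one particle by its velocity, unless frozen by proximity.
    static double[] step(double[] pos, double[] vel, boolean frozen) {
        if (frozen) return new double[]{pos[0], pos[1]};
        return new double[]{pos[0] + vel[0], pos[1] + vel[1]};
    }

    public static void main(String[] args) {
        double[] p1 = {0, 0}, p2 = {30, 40};           // distance 50
        boolean frozen = close(p1, p2, 60);            // true -> freeze
        double[] particle = {10, 10}, vel = {2, -1};
        double[] next = step(particle, vel, frozen);
        System.out.println(next[0] + "," + next[1]);   // 10.0,10.0
    }
}
```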
----
'''Sound feedback:'''
[[File:BBuzzer.png|thumb|left|300px]]
[[File:B&ard.png|thumb|left|300px]]
<br style="clear:both">
This was my original idea for the sound feedback: an individual buzzer for each single-color object, with a melody or sound playing when the distance between any two objects became close or far enough. But while testing the syncing of the tracking data with the Arduino, I found the buzzers wouldn't work the way I wanted: a melody meant for one specific pin would sound through all three buzzers. I couldn't solve this, so I had no choice but to look for another solution. The simplest was to play the audio through the Processing sketch from the computer's speakers, which is less instinctive and natural than the original plan. But it still explains the meaning very clearly when objects approach or part, with corresponding sound effects (close: heartbeat / far: heartbeat fading away).
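The final audio behavior reduces to a small decision per frame: below a close threshold play the heartbeat, between the two thresholds let it fade, beyond the far threshold stay silent. A hedged sketch of that decision in plain Java; the state names and threshold values are illustrative, and the real version triggers audio files from the Processing sketch:

```java
// Sketch of the final sound-feedback decision: heartbeat when objects
// are close, fading heartbeat when they part. Names and thresholds are
// illustrative; the real sketch plays audio through computer speakers.
public class SoundFeedback {
    enum Sound { HEARTBEAT, HEARTBEAT_FADING, SILENT }

    // Choose a sound from the distance between two tracked objects.
    static Sound pick(double distance, double closeThreshold, double farThreshold) {
        if (distance < closeThreshold) return Sound.HEARTBEAT;
        if (distance < farThreshold)   return Sound.HEARTBEAT_FADING;
        return Sound.SILENT;
    }

    public static void main(String[] args) {
        System.out.println(pick(40, 80, 200));   // HEARTBEAT
        System.out.println(pick(150, 80, 200));  // HEARTBEAT_FADING
        System.out.println(pick(300, 80, 200));  // SILENT
    }
}
```

Because this runs on the computer rather than on separate buzzer pins, the per-pin crosstalk problem described above simply disappears: there is one audio output, and the logic decides what it plays.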
<br style="clear:both">
== Final Result ==
I tried to project images from a locker onto the table, but I found I could not set up my camera properly. So I had to use magnets that attach to the locker, which serves as the ground, and put my webcam on the table so I could project the animations to the correct positions.
<br style="clear:both">
[[File:cppp.png|thumb|left|600px|Diagram]]
<br style="clear:both">
[[File:1l1.png|thumb|left|310px|Composition]]
[[File:2l1.png|thumb|left|310px|Projector]]
[[File:4l1.png|thumb|left|310px|Graphics]]
<br style="clear:both">
[[File:5l1.png|thumb|left|500px|Screenshot-1-Color Detection]]
[[File:6l1.png|thumb|left|400px|Screenshot-2-Close]]
[[File:7l1.png|thumb|left|400px|Screenshot-3-Far]]
<br style="clear:both">
The setup is able to track three objects, measure the distance between each pair, and trigger the interactions mentioned above, although the tracking is not stable. So, for the time being, two objects are more suitable for the interaction.
'''Tests'''
[https://drive.google.com/open?id=0BwFYVbmk3aHYTFpWRDVWTlhwUWc]
[https://drive.google.com/open?id=0BwFYVbmk3aHYT2gtU2RyQVdhN1k]
[https://drive.google.com/open?id=0BwFYVbmk3aHYdW9ZaDFScGl1X00]
== Reference ==
Generative music in SuperCollider & Processing - [https://www.youtube.com/watch?v=rMbcqv8rxnA]
Barbican's Rain Room: it's raining, but you won't get wet - [https://www.youtube.com/watch?v=EkvazIZx-F0]
Processing tutorial: Overview of data visualization | lynda.com - [https://www.youtube.com/watch?v=T5lRLA_Vn7o]
Tracking Performance using Kinect - [https://vimeo.com/99301608]
Sherry Turkle, Alone Together: Why We Expect More from Technology and Less from Each Other
Jenova Chen, Journey - [http://thatgamecompany.com/games/journey/]
Arduino to MaxMSP via OSC guide - [http://liamtmlacey.blogspot.de/2011/03/arduino-to-maxmsp-via-osc-guide-and.html]
Colour detection in Processing - [https://jamesalliban.wordpress.com/2008/11/16/colour-detection-in-processing/]
OpenCV Tutorial: Real-Time Object Tracking Without Colour - [https://www.youtube.com/watch?v=X6rPdRZzgjg]
Multiple Object Detection with Color Using OpenCV - [https://www.youtube.com/watch?v=hQ-bpfdWQh8]
Play Melody - [https://www.arduino.cc/en/Tutorial/PlayMelody]
TouchOSC - [http://hexler.net/software/touchosc]