GMU:Max and the World/Paulina Magdalena Chwala: Difference between revisions

From Medien Wiki


I was very interested in creating an interactive VR experience. Unfortunately, during the pandemic it is not possible to use the university rooms, so I tried to develop a basic interactive installation in Unity instead. The interaction is based on the position of the user: when a person comes closer to the sensor, the display changes accordingly. I want to create a simple experience, a situation that would not be possible in the real world.
[[File:Unityosc12356.gif]]


The installation is based on an Arduino ultrasonic sensor, a Max/MSP loop, and Unity real-time rendering. I created a room with lit-up spheres in Unity, which is displayed on the laptop monitor. In front of the monitor, the ultrasonic sensor is installed. Each time the user approaches the screen, the installation reacts with sound and movement of the spheres.
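The distance measurement behind such a setup can be sketched in a few lines. This is an illustrative Python sketch of the timing math an HC-SR04-style ultrasonic sensor sketch typically performs on the Arduino — the sensor model, names, and numbers are assumptions, not the project's actual code:

```python
# Illustrative sketch (assumed HC-SR04-style sensor, not the project's code):
# the sensor reports how long an ultrasonic ping took to bounce back,
# and the Arduino converts that echo time into a distance.

SPEED_OF_SOUND_CM_PER_US = 0.0343  # speed of sound at roughly room temperature

def echo_to_distance_cm(echo_pulse_us: float) -> float:
    """Convert the echo pulse width (microseconds) to a distance in cm.

    The pulse travels to the user and back, so the one-way distance
    is half of (pulse duration * speed of sound).
    """
    return (echo_pulse_us * SPEED_OF_SOUND_CM_PER_US) / 2

# A user standing about 50 cm from the screen yields a pulse near 2915 µs:
print(round(echo_to_distance_cm(2915)))  # prints 50
```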
data loop: ultrasonic sensor -> Max/MSP (sound) -> OSC protocol -> Unity (image)
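The OSC messages carrying the sensor value from Max/MSP to Unity are small UDP packets with a fixed binary layout: a null-padded address string, a type-tag string, then the arguments in big-endian form. A minimal sketch of that wire format, assuming a single float argument (the `/distance` address is a hypothetical name, not necessarily the one used in the patch):

```python
import struct

def osc_message(address: str, value: float) -> bytes:
    """Build a minimal OSC packet carrying one float32 argument.

    Illustrative sketch of the OSC 1.0 wire format; the address name
    is an assumption, not taken from the actual Max/MSP patch.
    """
    def pad(b: bytes) -> bytes:
        # OSC strings are null-terminated and padded to 4-byte boundaries
        return b + b"\x00" * (4 - len(b) % 4)
    return pad(address.encode()) + pad(b",f") + struct.pack(">f", value)

packet = osc_message("/distance", 50.0)
```

On the Unity side, an OSC plugin listens on a UDP port, strips this framing, and hands the float to a C# script.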


<gallery>
File:Board444.jpeg|The Arduino ultrasonic sensor detects the position of the user
File:Arduino loop1.jpeg|Data is collected and transformed in the Arduino code
File:Max loop2.jpeg|Data is received, unpacked and transformed in the Max/MSP loop, which creates sound according to the data
File:Unity33.jpeg|Data is sent from Max/MSP to Unity via the OSC protocol; external C# plugins in Unity receive the data and transform objects in the scene
</gallery>
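The last step — transforming objects in the scene from the received data — boils down to mapping the distance onto an object parameter such as scale. A hedged Python sketch of one plausible mapping (all ranges, names, and values here are illustrative assumptions; the actual C# plugin and scene settings are not shown):

```python
def distance_to_scale(distance_cm: float,
                      near: float = 10.0, far: float = 200.0,
                      min_scale: float = 0.5, max_scale: float = 3.0) -> float:
    """Map the sensor distance to a sphere scale: closer user -> larger spheres.

    Illustrative sketch; the real project's ranges and parameter are unknown.
    """
    # clamp the reading into the sensor's useful range
    d = max(near, min(far, distance_cm))
    # normalize: 0.0 when the user is close, 1.0 when far away
    t = (d - near) / (far - near)
    # invert so approaching the screen grows the spheres
    return max_scale - t * (max_scale - min_scale)

print(distance_to_scale(10))   # prints 3.0 (closest -> biggest)
print(distance_to_scale(200))  # prints 0.5 (farthest -> smallest)
```

In Unity the returned value would typically be applied each frame, e.g. to a sphere's `transform.localScale`.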
 


===Assignments===

Revision as of 20:02, 2 November 2020


*Homework 1
*Homework 2
*Homework 3
*Homework 4
*Midterm Presentation
*Progress1
*CONCEPT
*FINAL