GMU:Max and the World/Paulina Magdalena Chwala

===Interactive Installation===
I will create a room with light-up spheres in Unity that will be displayed on a laptop monitor. An ultrasonic sensor will be installed in front of the monitor. Each time the user approaches the screen, the installation will react with sound and movement of the spheres.
The installation uses an Arduino ultrasonic sensor, a Max/MSP loop and Unity real-time rendering:
ultrasonic sensor -> Max/MSP (sound) -> OSC protocol -> Unity (image)
[[File:YouCut 20200907 193520892 Trim.mp4|600px]]


===Implementation===


I was very interested in creating an interactive VR experience. Unfortunately, during the pandemic it is not possible to use the university rooms, so I tried to develop a basic interactive installation in Unity instead. The interaction will be based on the position of the user: if the person comes closer to the sensor, the display will change accordingly. I want to create a simple experience, a situation that would not be possible in the real world.


[[File:Unityosc12356.gif]]


The installation is based on an Arduino ultrasonic sensor, a Max/MSP loop, and Unity real-time rendering. I created a room with light-up spheres in Unity that is displayed on the laptop monitor. In front of the monitor, the ultrasonic sensor is installed. Each time the user approaches the screen, the installation reacts with sound and movement of the spheres.


Data loop: ultrasonic sensor -> Max/MSP (sound) -> OSC protocol -> Unity (image)


[[File:Board444.jpeg]]
The Arduino ultrasonic sensor detects the position of the user.

[[File:Arduino loop1.jpeg]]
Data is collected and transformed in the Arduino code.
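The Arduino code itself is not reproduced on this page. As a rough illustration of this step, the sketch below assumes an HC-SR04-style ultrasonic sensor on pins 9 and 10 and a simple one-value-per-line serial output that the Max/MSP patch can read; the pin numbers, baud rate and output format are assumptions, not the project's actual code.

<syntaxhighlight lang="cpp">
// Minimal sketch (assumed wiring): HC-SR04 trigger on pin 9, echo on pin 10.
const int trigPin = 9;
const int echoPin = 10;

void setup() {
  pinMode(trigPin, OUTPUT);
  pinMode(echoPin, INPUT);
  Serial.begin(9600);               // Max/MSP reads this serial port
}

void loop() {
  // Fire a 10 microsecond trigger pulse
  digitalWrite(trigPin, LOW);
  delayMicroseconds(2);
  digitalWrite(trigPin, HIGH);
  delayMicroseconds(10);
  digitalWrite(trigPin, LOW);

  // Measure the echo pulse and convert it to centimetres
  long duration = pulseIn(echoPin, HIGH);
  long distanceCm = duration / 58;  // approximate HC-SR04 conversion

  Serial.println(distanceCm);       // one distance value per line
  delay(50);
}
</syntaxhighlight>

A stream like this can then be picked up in Max/MSP with its serial object and unpacked into the numbers that drive the sound.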

[[File:Max loop2.jpeg]]
Data is received, unpacked and transformed in the Max/MSP loop, which creates sound according to the data.

[[File:Unity33.jpeg]]
Data is sent from Max/MSP to Unity via the OSC protocol; external C# plugins in Unity receive the data and transform objects in the scene.
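The Unity receiver itself is a C# OSC plugin and is not shown here. To illustrate the OSC step in between, the standalone C++ sketch below (not the Unity plugin and not part of the project) builds the kind of message involved: an address pattern with a single float argument, packed according to the OSC rules and sent as one UDP datagram. The address /distance, the port 9000 and the localhost target are all assumptions for illustration.

<syntaxhighlight lang="cpp">
// Standalone illustration of a one-float OSC message sent over UDP (POSIX sockets).
#include <arpa/inet.h>
#include <netinet/in.h>
#include <sys/socket.h>
#include <unistd.h>
#include <cstdint>
#include <cstring>
#include <string>
#include <vector>

// OSC strings are null-terminated and padded with zeros to a multiple of four bytes.
static void appendPadded(std::vector<char>& buf, const std::string& s) {
    buf.insert(buf.end(), s.begin(), s.end());
    buf.push_back('\0');
    while (buf.size() % 4 != 0) buf.push_back('\0');
}

int main() {
    std::vector<char> packet;
    appendPadded(packet, "/distance");   // assumed address pattern
    appendPadded(packet, ",f");          // type tag: one float argument

    float value = 42.5f;                 // e.g. a distance in centimetres
    uint32_t bits;
    std::memcpy(&bits, &value, sizeof(bits));
    bits = htonl(bits);                  // OSC numbers are big-endian
    const char* p = reinterpret_cast<const char*>(&bits);
    packet.insert(packet.end(), p, p + sizeof(bits));

    // Send the datagram to the machine running Unity (assumed: localhost:9000).
    int sock = socket(AF_INET, SOCK_DGRAM, 0);
    sockaddr_in dest{};
    dest.sin_family = AF_INET;
    dest.sin_port = htons(9000);
    inet_pton(AF_INET, "127.0.0.1", &dest.sin_addr);
    sendto(sock, packet.data(), packet.size(), 0,
           reinterpret_cast<sockaddr*>(&dest), sizeof(dest));
    close(sock);
    return 0;
}
</syntaxhighlight>

On the Unity side, an OSC library listening on the same port parses the address and the float, and the value can then be mapped onto the spheres' transforms.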


===Assignments===
*Homework 1
*Homework 2
*Homework 3
*Homework 4
*Midterm Presentation
*Progress1
*CONCEPT
*FINAL