GMU:Home Made Bioelectronics/Marah Doleh/Documentation of progress
==CONCEPT==
In this project, I plan to create an interactive installation that reflects the relationship between architectural space and music, with an emphasis on the emotional aspect of that experience.

The installation will consist of a musical instrument, VR glasses, and a heart-rate (HR) sensor. The participant is allowed to play any music they want and to change the music they are playing at any point during the experience.

The interaction starts once the participant begins playing and an emotional change is detected; this triggers an audio-reactive environment to appear. The environment keeps changing in response to the emotional changes during the musical performance, which will (probably) in turn affect the participant's emotions. In this way the participant is influenced by the music, their changing emotions influence the virtual space, the altered space triggers another emotional change, which changes the music, and so on. This closed feedback loop is an embodiment of the relationship between architecture, music, and emotion.
==TECHNICAL ASPECT==
This installation will be divided into three main parts: an emotion detector (HR sensor + Arduino + Python), music (a piano or any other instrument), and an audio-reactive environment (built in TouchDesigner and viewed through VR glasses).
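A minimal Python sketch of the sensor side of this pipeline: parsing raw pulse-sensor readings that arrive from the Arduino over serial and smoothing them before they are forwarded to the emotion-recognition code and to TouchDesigner. The one-integer-per-line serial format and the window size are assumptions, not decided parts of the project.

```python
from collections import deque

def parse_reading(line: str):
    """Return the integer sensor value from one serial line, or None if garbled."""
    line = line.strip()
    return int(line) if line.lstrip("-").isdigit() else None

def smooth(values, window=5):
    """Simple moving average to tame sensor noise (window size is a placeholder)."""
    buf = deque(maxlen=window)
    out = []
    for v in values:
        buf.append(v)
        out.append(sum(buf) / len(buf))
    return out

if __name__ == "__main__":
    # Simulated serial lines, including one garbled reading that is dropped.
    raw = ["512\n", "520\n", "glitch\n", "530\n"]
    readings = [r for r in (parse_reading(x) for x in raw) if r is not None]
    print(smooth(readings))
```

In the real setup the lines would come from a pyserial connection to the Arduino rather than a hard-coded list.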
==AESTHETIC ASPECT==
Although the software and the general vision of the visual outcome are predetermined (an interactive architectural space that changes with emotions and music), the design itself is not yet fixed; it will be developed during this semester in the TouchDesigner course I am currently taking.
==What has been done so far:==
*Testing receiving data from Arduino in TouchDesigner, using an ultrasonic distance sensor (only for this test)
*Manipulating a 3D object using the data received from Arduino
[[File:TD-trim.mp4|600px]]
Links used for this step:
[https://youtu.be/83K3QEK6Iv0 Feedback in Touchdesigner],
[https://youtu.be/V_Q_fDukTI0 TOUCHDESIGNER & ARDUINO]
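In the Arduino-to-TouchDesigner test, the incoming sensor value has to be remapped before it can drive a parameter of the 3D object. A sketch of that remapping as a plain Python function (the input range is an assumption; inside TouchDesigner the same thing would normally be done with a Math CHOP on the serial channel):

```python
def map_range(x, in_min, in_max, out_min=0.0, out_max=1.0):
    """Linearly map a sensor reading into a parameter range, clamped.

    E.g. an ultrasonic distance in cm -> a 0..1 value driving geometry.
    """
    if in_max == in_min:
        return out_min  # degenerate range: avoid division by zero
    t = (x - in_min) / (in_max - in_min)
    t = max(0.0, min(1.0, t))  # clamp so sensor spikes cannot break the visual
    return out_min + t * (out_max - out_min)
```

Clamping matters here because ultrasonic sensors occasionally return out-of-range spikes that would otherwise jolt the geometry.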
*Audio-reactive 3D model
[[File:Audioreactive test.mp4|600px]]
Links used for this step:
[https://youtu.be/SJZIMGg-thY Instancing in Touchdesigner],
[https://youtu.be/R7sAomk2vR4 Touchdesigner audio analysis]
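TouchDesigner's audio analysis does this with CHOPs, but the underlying idea can be illustrated in plain Python: derive a per-frame loudness envelope from the audio samples, which can then drive instance scale or colour. The frame size is a placeholder.

```python
import math

def rms(frame):
    """Root-mean-square loudness of one audio frame (samples in [-1, 1])."""
    return math.sqrt(sum(s * s for s in frame) / len(frame))

def frames(samples, size):
    """Split the sample stream into non-overlapping frames of `size` samples."""
    for i in range(0, len(samples) - size + 1, size):
        yield samples[i:i + size]

def envelope(samples, size=4):
    """One loudness value per frame: the control signal for the 3D instances."""
    return [rms(f) for f in frames(samples, size)]
```

A real version would use a larger frame size (e.g. 512 samples) and per-band analysis, as in the linked audio-analysis tutorial.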
==What's next?==
====Arduino====
*Send '''pulse''' data to TD
*Send '''pulse''' data to the emotion recognition code
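Whatever form the '''pulse''' data takes, it will have to be converted into a heart rate at some point. A sketch of that conversion, assuming the sensor reports inter-beat intervals in milliseconds (an assumption, not a decided format):

```python
def bpm_from_ibis(ibis_ms):
    """Average heart rate in BPM from inter-beat intervals in milliseconds.

    60,000 ms per minute divided by the mean interval between beats.
    """
    if not ibis_ms:
        return None  # no beats detected yet
    mean_ibi = sum(ibis_ms) / len(ibis_ms)
    return 60000.0 / mean_ibi
```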
====Python | Emotion recognition====
*Emotion recognition
*Send data to TD
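The emotion recognition itself will rely on trained models (see the emoHR and PPG/GSR repositories in the readings list below); as a stand-in, a crude heart-rate-variability heuristic plus a UDP sender toward TouchDesigner can sketch the interface. The SDNN threshold, host, port, and JSON message format are all placeholders.

```python
import json
import socket
import statistics

def classify_arousal(ibis_ms, calm_hrv=50.0):
    """Very rough emotion proxy: low beat-to-beat variability (SDNN) is read
    as high arousal, high variability as calm. Threshold is a placeholder."""
    sdnn = statistics.pstdev(ibis_ms)
    return "aroused" if sdnn < calm_hrv else "calm"

def send_to_td(label, host="127.0.0.1", port=7000):
    """Forward the label to TouchDesigner as a JSON UDP datagram; a UDP In DAT
    on the TD side would receive it. Host and port are assumptions."""
    msg = json.dumps({"emotion": label}).encode()
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as s:
        s.sendto(msg, (host, port))
```

UDP is a common way to push live control data into TouchDesigner; OSC (via an OSC In CHOP) would be an equally reasonable transport.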
====TouchDesigner====
*Test other forms of 3D (point clouds, more complex instancing, ...)
*Try other sets of reactions to music
*Create multiple scenes for different emotions
*Intersection with Arduino: reaction to the '''pulse''' input
*Intersection with Python: trigger changes of scene based on the '''emotional''' input
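A sketch of how the '''emotional''' input could trigger scene changes: a label-to-index mapping, used from a TouchDesigner callback that switches between the per-emotion scenes. The emotion labels and operator names here are hypothetical, not part of the project yet.

```python
# Hypothetical emotion labels -> inputs of a Switch TOP holding the scenes.
SCENES = {"calm": 0, "aroused": 1, "joyful": 2}

def scene_index(emotion, default=0):
    """Map an incoming emotion label to a scene index; unknown labels fall
    back to the default scene so a bad message never breaks the visuals."""
    return SCENES.get(emotion, default)

# Inside TouchDesigner this could live in the callbacks DAT of the UDP In DAT
# that receives the emotion messages (operator name is an assumption):
#
# def onReceive(dat, rowIndex, message, peer):
#     op('scene_switch').par.index = scene_index(message.strip())
```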
==List of readings and references:==
*https://www.youtube.com/watch?v=yvbThD835m0
*https://github.com/lbugnon/emoHR
*https://github.com/FarhatBuet14/Heart-Rate-Monitoring-with-Emotion-Detection-using-PPG-GSR-Sensors
*https://www.youtube.com/watch?v=1clagERAiR4
*https://link.springer.com/article/10.1007/s11042-020-09576-0
*https://www.thinkmind.org/articles/etelemed_2016_7_40_40093.pdf
*https://www.scitepress.org/papers/2015/52411/52411.pdf
*https://www.researchgate.net/publication/342075389_Development_of_ECG_sensor_using_arduino_uno_and_e-health_sensor_platform_mood_detection_from_heartbeat
*https://link.springer.com/chapter/10.1007/11941354_44
Latest revision as of 08:11, 12 July 2022