===Project===
'''Chicken Tracker'''
We humans communicate: at work, with friends and colleagues, and with relatives. To classify the messages communicated to us, we use our senses. In direct verbal communication, our auditory perception is primarily engaged. At first glance, one could think the acoustic channel is clear and unmistakable. What may be true in individual cases, however, reveals itself in everyday life as a diverse, complex act. Human communication and perception are multi-layered. Besides the acoustic channel, we use optical, olfactory and tactile stimuli to communicate with our fellow human beings. What happens to the understanding of our counterparts when visual communication is restricted, feeling and smelling are eliminated entirely, and eye contact, facial expressions and gestures are distorted by the Internet?
[[File:photo_2021-01-14 17.16.54.jpeg|400px]]
The current pandemic and the associated lockdown limit verbal, interpersonal exchange to two elementary channels. Acoustic communication and the visual communication made possible by video telephony come to the fore in our everyday lives. However, contact via the medium of the Internet lacks the possibility of expressing empathy and feelings. For this purpose I have invented the "Chicken Tracker".

As a support tool for the usual video chat, the "Chicken Tracker" is meant to visually amplify a feeling.

The basic idea is simple. Participant A starts talking and initially appears completely realistic in the video chat. If A talks himself into a rage, the volume of his acoustic output increases steadily. The amplified signal in turn turns his skin color increasingly red through the "Chicken Tracker", until A appears completely red. Once in a rage, the volume of A's output keeps increasing until it overloads, and A turns into a chicken for as long as the volume stays above a threshold X.
The whole idea is about loosening up in online discussions and not taking yourself too seriously.
[[File:photo_2021-01-14 17.15.54.jpeg|400px]]
===Development===
The project has several stages of development:
====Change certain colours in a Max video using grab====
<gallery>
File:Bildschirmfoto 2021-01-18 um 17.15.33.png|patch exchanging bright/dark colours by a matrix
</gallery>
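The Max patch itself is only shown as a screenshot above. As a rough, hypothetical sketch of the same per-cell idea in Python/NumPy (treating a video frame as a matrix the way Jitter does; the threshold value is an assumption, not taken from the patch):

```python
import numpy as np

def exchange_bright_dark(frame):
    """Invert pixel values so bright colours become dark and vice versa,
    a jit.op-style per-cell operation on the whole frame matrix."""
    return 255 - frame

def exchange_above(frame, threshold=128):
    """Invert only pixels whose mean brightness exceeds `threshold`,
    leaving the rest of the frame untouched."""
    luma = frame.mean(axis=-1, keepdims=True)  # per-pixel brightness
    return np.where(luma > threshold, 255 - frame, frame).astype(np.uint8)
```

In the actual patch this kind of exchange would be done with Jitter matrix operators rather than NumPy.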
====Tracking human skin color by RGB code====
<gallery>
File:Bildschirmfoto 2021-01-15 um 09.44.05.png|Source: https://colorswall.com/palette/2513/
</gallery>
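The linked palette shows typical skin tones. A hedged Python/NumPy sketch of classifying pixels by RGB ranges could look like this; the numeric bounds are a widely used rule of thumb for skin detection, not values taken from the palette or the patch:

```python
import numpy as np

def skin_mask(frame):
    """Boolean mask of pixels that fall inside a common RGB skin-tone
    heuristic (illustrative bounds, not the project's actual values)."""
    r = frame[..., 0].astype(int)
    g = frame[..., 1].astype(int)
    b = frame[..., 2].astype(int)
    spread = (np.maximum(np.maximum(r, g), b)
              - np.minimum(np.minimum(r, g), b))
    return ((r > 95) & (g > 40) & (b > 20)   # bright enough, not too blue
            & (r > g) & (r > b)              # red dominates
            & (abs(r - g) > 15)              # enough red/green contrast
            & (spread > 15))                 # not a grey pixel
```

Such a mask would decide which pixels the red tint (next stages) is applied to.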
====Changing color by sound====
*test 1: [[:File:Max-test1.maxpat]]
*test 2: [[:File:Max-test2 .maxpat]]
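The two Max test patches are linked above rather than reproduced. As a plain-Python sketch of the audio side of this stage (assuming float samples in -1..1; the floor/ceiling values are placeholders, and the mapping mimics what a scale-style object would do in Max):

```python
import math

def rms_level(samples):
    """Root-mean-square amplitude of an audio buffer, the kind of
    envelope value that would drive the colour stage."""
    return math.sqrt(sum(s * s for s in samples) / len(samples))

def level_to_redness(level, floor=0.05, ceiling=0.8):
    """Map an RMS level onto a 0..1 'redness' control value,
    clamped at both ends (floor/ceiling are assumed values)."""
    return min(1.0, max(0.0, (level - floor) / (ceiling - floor)))
```

Quiet speech below the floor leaves the colour untouched; anything above the ceiling saturates the effect.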
====Changing skin color to red by volume====
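A minimal sketch of this stage in Python/NumPy rather than Max/Jitter: the frame is blended toward pure red as the volume control rises, and above the threshold X the chicken image replaces the video entirely. The threshold value and function names are placeholders, not taken from the patches.

```python
import numpy as np

CHICKEN_THRESHOLD = 0.9  # placeholder for the threshold X in the concept

def apply_rage(frame, chicken_frame, volume):
    """Blend the live frame toward red as volume (0..1) rises;
    above CHICKEN_THRESHOLD, show the chicken frame instead."""
    if volume > CHICKEN_THRESHOLD:
        return chicken_frame
    red = np.zeros_like(frame)
    red[..., 0] = 255  # pure red in an RGB frame
    mix = min(max(float(volume), 0.0), 1.0)
    return (frame * (1.0 - mix) + red * mix).astype(np.uint8)
```

In the real project the blend would only be applied to the skin-coloured pixels found in the tracking stage.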
====Building two interconnected patches====
*using the OSC protocol
*alternatively "jit.net.send/recv"
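To illustrate the OSC option: an OSC message is just a UDP datagram with a null-padded address pattern, a null-padded type-tag string, and big-endian arguments (per the OSC 1.0 specification). A stdlib-only Python sketch, with a hypothetical address and port:

```python
import struct

def osc_message(address, value):
    """Encode a single-float OSC message by hand: address and the
    type-tag string ",f" are null-padded to 4-byte boundaries, the
    float argument is a big-endian 32-bit float."""
    def pad(b):
        # OSC strings get at least one null and end on a 4-byte boundary
        return b + b"\x00" * (4 - len(b) % 4)
    return pad(address.encode()) + pad(b",f") + struct.pack(">f", value)

# Sending it is a single UDP datagram, e.g. to a [udpreceive 7400]
# in the second patch (address and port are assumptions):
#   import socket
#   sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
#   sock.sendto(osc_message("/redness", 0.7), ("127.0.0.1", 7400))
```

In Max itself the [udpsend]/[udpreceive] objects do this encoding for you; jit.net.send/recv would instead stream whole Jitter matrices between the patches.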
===Exercises===
"Greenscreen"
1. Try two videos in one patch; the goal is that they run mixed into each other.
*[[:File:VIDEO_PLAY1.maxpat]]
*https://www.studioroosegaarde.net/info
*https://www.studioroosegaarde.net/project/space-waste-lab
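The exercise's "run mixed into each other" is a crossfade. A hypothetical Python/NumPy sketch of the frame-level operation (in Max/Jitter a crossfade object would do this per frame):

```python
import numpy as np

def crossfade(frame_a, frame_b, mix):
    """Blend two video frames: mix=0 gives frame_a, mix=1 gives
    frame_b, values in between mix the two into each other."""
    mix = min(max(float(mix), 0.0), 1.0)
    return (frame_a * (1.0 - mix) + frame_b * mix).astype(np.uint8)
```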
Latest revision as of 15:34, 6 April 2021