A normal pulse is regular in rhythm and strength. Sometimes it is higher, sometimes lower, depending on the situation and health condition. You rarely see it and only feel it from time to time. Apart from the visualisation of an EKG, what could it look like? How can this vital rhythm be visualised?
[[File:Pulse Interaction.jpg|800px]]
I would like to use an Arduino pulse sensor to create an interactive installation. The setup contains the pulse sensor and a screen. The user can easily connect to Max by clipping the Arduino sensor to their finger. The webcam is connected and the screen shows a pulsating and distorted image of the participant. The pulse sensor visually stretches the webcam image, moves it in the rhythm of the pulse and creates a new result.
A more complex version would be the interaction of two pulses. Two Arduino pulse sensors are connected to Max and one distorted webcam image shows the combination of the pulses of two people. This idea could be further developed during the semester.
How does it work: an Arduino pulse sensor measures the pulse at the finger. Usually, this type of sensor is used to measure the radial pulse. In Max, a patch must be created in which the data from the Arduino is converted into parameters that distort the image.
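As a note to myself, a minimal sketch of how the Arduino side could look, assuming the sensor's analog output is wired to pin A0 and the baud rate matches the serial object in the Max patch (both are assumptions, not the final setup):

<syntaxhighlight lang="cpp">
// Sketch: stream the raw pulse signal over serial so that Max can read it
// and convert it into parameters. Assumption: analog pulse sensor on pin A0.
const int PULSE_PIN = A0;           // analog input the sensor is wired to
const unsigned long SAMPLE_MS = 20; // roughly 50 samples per second

void setup() {
  Serial.begin(9600);               // must match the baud rate set in the Max patch
}

void loop() {
  int raw = analogRead(PULSE_PIN);  // 0-1023 on a standard Arduino Uno
  Serial.println(raw);              // one value per line
  delay(SAMPLE_MS);
}
</syntaxhighlight>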
----
'''27.06.21'''
Since I want to set up a user-friendly installation and I failed to build a pulse sensor with the equipment I have at home, I ordered one online. Meanwhile, a moodboard:
[[File:260521_moodboard.png|800px]]
----
'''03.06.2021'''
[[File:Screenshot 2021-06-03 at 11.16.31.png|800px]]
I connected the pulse sensor to Max and had some troubles on my way there. As the sensor is sensitive to light, I came up with the idea to put it inside a box. It seems to make a difference, but I am not sure. I used similar objects from the "sensing physical parameters" patch and Arduino code I found online.
<gallery>
File:Pulse Sensor to Max_2 copy.jpg
File:Pulse Sensor to Max_1.jpg
File:Screenshot 2021-06-03 at 12.08.36.png
File:Screenshot 2021-06-03 at 12.11.04.png
File:Screenshot 2021-06-03 at 12.12.03.png
File:PPG_forehead.png
</gallery>
I have no clue how to connect the input from the sensor to the webcam image. I am searching online, but I haven’t found good references or tutorials on how to distort a live image with a specific data input. I would be very happy about any hints.
During my research, I found a method of using the webcam as the tool to measure your heart rate. I think this would be way more interesting!! [https://hackaday.com/2020/12/25/webcam-heart-rate-monitor-brings-photoplethysmography-to-your-pc/] The webcam would be used as the sensor and for creating the visuals as well.
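To keep the principle in mind: the colour of the skin (especially the green channel) changes very slightly with every heartbeat, and the webcam can pick that up. A rough sketch of only that measurement step, assuming OpenCV and a camera at index 0; a real heart-rate estimate would still need face detection, filtering and frequency analysis:

<syntaxhighlight lang="cpp">
// Sketch of the webcam-PPG measurement step: print the mean green value of a
// central region for every frame. The slow fluctuation of this series over
// time corresponds to the pulse. Assumptions: OpenCV installed, camera index 0,
// face roughly centred in the frame.
#include <opencv2/opencv.hpp>
#include <iostream>

int main() {
    cv::VideoCapture cam(0);
    if (!cam.isOpened()) return 1;

    cv::Mat frame;
    while (cam.read(frame)) {
        // Central third of the frame as a stand-in for a detected face region
        cv::Rect roi(frame.cols / 3, frame.rows / 3, frame.cols / 3, frame.rows / 3);
        cv::Scalar mean = cv::mean(frame(roi));
        std::cout << mean[1] << std::endl;  // index 1 = green channel (BGR order)
    }
    return 0;
}
</syntaxhighlight>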
What I am imagining right now, but don’t know where to start, is the following: the visitor comes closer to the screen and sees themselves in the webcam image. The pulse is identified by the webcam and the human is recognized as well (ideally turning red-ish; opacity 50%; maybe by using the cellblock object?). The pixels of the human stretch to the rhythm of the pulse.
----
'''04.06.2021'''
Please find here the patch: [[:File:210604_Project.maxpat]]
I was trying to manipulate the webcam image pixel by pixel. I can't figure out how to create a new image / video from the edited cellblock (the edited pixels).
Do you have a tutorial or references for that? I am wondering if a command which manipulates the whole image - not individual pixels - would be interesting. Probably that's why you recommended jit.gl.pix, but I can't make it work.
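To note down the idea anyway, here is how "manipulating the whole image with one value" could look outside of Max, as a C++/OpenCV sketch (this is not jit.gl.pix, only an illustration of the concept): one displacement map is rebuilt every frame and its strength is driven by a single number, which would later be the pulse value.

<syntaxhighlight lang="cpp">
// Concept sketch (not Max): warp the whole webcam frame with a sine-shaped
// displacement whose amplitude is driven by one value. Here the value is a
// slow oscillation standing in for the smoothed pulse signal.
// Assumptions: OpenCV installed, webcam at index 0.
#include <opencv2/opencv.hpp>
#include <cmath>

int main() {
    cv::VideoCapture cam(0);
    if (!cam.isOpened()) return 1;

    cv::Mat frame, warped;
    double t = 0.0;
    while (cam.read(frame)) {
        double pulse = 0.5 + 0.5 * std::sin(t);  // stand-in for the pulse value, 0..1
        t += 0.1;

        // Per-pixel lookup maps: shift each row horizontally by a sine wave
        // whose amplitude grows with the pulse value.
        cv::Mat mapX(frame.size(), CV_32FC1), mapY(frame.size(), CV_32FC1);
        for (int y = 0; y < frame.rows; ++y)
            for (int x = 0; x < frame.cols; ++x) {
                mapX.at<float>(y, x) = x + 15.0f * (float)pulse * std::sin(y * 0.05);
                mapY.at<float>(y, x) = (float)y;
            }

        cv::remap(frame, warped, mapX, mapY, cv::INTER_LINEAR);
        cv::imshow("pulse distortion", warped);
        if (cv::waitKey(1) == 27) break;         // Esc to quit
    }
    return 0;
}
</syntaxhighlight>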
<gallery>
File:manipulate webcam image.png
File:Screenshot 2021-06-07 at 23.25.13.png
</gallery>
----
'''10.06.2021'''
Please find here an update on the pixel manipulation.
<gallery>
File:manipulate webcam image_2.png
</gallery>
I was not so happy with the pixel manipulation. Online, I found an effect that I really like. It is used for motion tracking. If the object is still, you don't see it. If it moves, you can just barely see its outline. If it moves faster, you can see more and more of it.
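The effect reads like frame differencing to me: every output frame is the absolute difference between the current and the previous webcam frame, so everything static turns black and only changes stay visible, faint for slow motion and stronger for fast motion. A small sketch of the principle, assuming OpenCV and a webcam at index 0 (in Max the same idea would be built with Jitter objects instead):

<syntaxhighlight lang="cpp">
// Frame differencing sketch: show only what changed between two frames.
// Assumptions: OpenCV installed, webcam at index 0.
#include <opencv2/opencv.hpp>

int main() {
    cv::VideoCapture cam(0);
    if (!cam.isOpened()) return 1;

    cv::Mat prev, curr, diff;
    cam.read(prev);                      // first frame as the initial reference
    while (cam.read(curr)) {
        cv::absdiff(curr, prev, diff);   // pixel-wise |current - previous|
        cv::imshow("motion only", diff);
        curr.copyTo(prev);               // keep this frame for the next comparison
        if (cv::waitKey(1) == 27) break; // Esc to quit
    }
    return 0;
}
</syntaxhighlight>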
'''16.06.2021'''
{{#ev:youtube|OIjoACkDEIc}}
Please find here the patches:
*Mousestate [[:File:210613_Mousestate.maxpat]]
*Pulse Sensor [[:File:210616_Pulse.maxpat]]
Before connecting the pulse sensor to Max, I first tried to simplify the setup with "mousestate".
After that, I just had to connect the Arduino again.
<gallery>
File:Screenshot 2021-06-16 at 22.54.26.png
File:Pulse Sensor.png
</gallery>
Everything worked and I generated snapshots triggered by the pulse sensor. Unfortunately, I was not happy with the result. I still had the last patch in my mind, where the snapshot is triggered by a sound. With that setup, I was able to move away from the computer. With the pulse sensor setup, I was glued to the screen, unable to move around.
----
'''17.06.2021'''
{{#ev:youtube|Mha9aTXIeDU}}
We perceive things that move; they trigger our attention and curiosity, and we immediately perceive them as living matter. Likewise, when there is a sudden noise, we prick up our ears.
{{#ev:youtube|S-nIkN0xIuk}}
Combining these two expressive triggers can lead to an alarming outcome, but this project aims to visualise a living organism that can only be seen when it moves and is captured by sound. Finger snapping, hand clapping, whistling, drumming, hitting, screaming, stamping, clanging, etc. are either visible or not in the image. The sequence of single snapshots is shown as a soundless video. As a result, the brain interprets the new sequence differently from the original webcam video and new movements are seen.
Humans perceive movement through motion perception. Our world consists of movement, of changes in spatial references. Motion perception accompanies us in everyday life and is important for finding our way in the world. However, we see not only real movement, but also apparent movement. Apparent movement is the perception of movement in objects that are not really moving in the physical sense. It includes stroboscopic movement, the perception of movement when viewing a sequence of slightly varied individual images.
This setup encourages the visitor to move in front of the camera,
to show an emotion, to make a sound, to speak out loud.
'''Next step: going outside, capturing people/cars/objects/animals/plants that move, caught by the random sounds of the location.'''