GMU:In Sync/Projects/Fog Installation

From Medien Wiki
==Programming==


One of the first things I tried to accomplish in Pure Data was building a simple cross-fader between two moving images. I knew that I would want motion detection incorporated into my final installation; however, given my introduction to visual programming languages, I thought it better to start simple. You may load your own media into the patch.


[[Media:Video Cross-Disolve.pd]]
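The patch itself does the blending inside Pure Data, but the underlying math is a simple linear mix. The sketch below illustrates the same per-pixel cross-fade in C++ (my own illustration, not code from the patch): at alpha 0 only the first frame is visible, at alpha 1 only the second.

```cpp
#include <cstddef>
#include <cstdint>
#include <vector>

// Linear cross-fade between two frames of equal size, stored as
// flat arrays of 8-bit pixel values.
// alpha = 0.0 shows only frame a; alpha = 1.0 shows only frame b.
std::vector<uint8_t> crossfade(const std::vector<uint8_t>& a,
                               const std::vector<uint8_t>& b,
                               double alpha) {
    std::vector<uint8_t> out(a.size());
    for (std::size_t i = 0; i < a.size(); ++i) {
        // Weighted average of the two frames, rounded to nearest.
        out[i] = static_cast<uint8_t>((1.0 - alpha) * a[i] + alpha * b[i] + 0.5);
    }
    return out;
}
```

Sweeping alpha from 0 to 1 over time produces the dissolve between the two videos.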


A second iteration of the patch utilizes the computer's camera to sense movement and trigger the cross-fade/dissolve between the two videos. Again, you may wish to load your own media into the patch.


[[Media:Cross-Disolve Camera.pd]]
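A common way to sense movement from a camera, and the general idea behind this patch, is frame differencing: compare each frame to the previous one and treat a large enough change as motion. A minimal C++ sketch of that idea (the threshold value here is my own assumption, not taken from the patch):

```cpp
#include <cstdint>
#include <cstdlib>
#include <vector>

// Mean absolute difference between two consecutive grayscale frames.
double meanFrameDiff(const std::vector<uint8_t>& prev,
                     const std::vector<uint8_t>& curr) {
    long sum = 0;
    for (std::size_t i = 0; i < prev.size(); ++i)
        sum += std::abs(int(curr[i]) - int(prev[i]));
    return double(sum) / double(prev.size());
}

// Report motion (and so trigger the cross-fade) when the average
// per-pixel change exceeds a threshold. The threshold is arbitrary
// here and would be tuned to the camera and lighting conditions.
bool motionDetected(const std::vector<uint8_t>& prev,
                    const std::vector<uint8_t>& curr,
                    double threshold = 10.0) {
    return meanFrameDiff(prev, curr) > threshold;
}
```

In the installation the boolean trigger would start the dissolve rather than simply flip between the videos.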


With several iterations in between, I was finally able to build a patch that performed the basic function needed for my installation. This patch utilizes an ultrasonic motion sensor and an Arduino microcontroller to measure the distance of an object (the viewer) to the screen where the images or videos are projected, and responds accordingly: the cross-fade is triggered by the distance of the viewer. The following is a rudimentary version of what I hope to achieve in a larger installation context that more thoroughly considers the various behaviors of the viewer. I've also included the modified ping sensor code for reference. Again, please load your own media into the patch.


[[Media:Fullscreen Final Patch.pd]]

[[Media:Ping Sensor Code.pdf]]
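The actual sensor code is in the PDF above; as an illustration of the two steps involved, the C++ sketch below converts an ultrasonic echo pulse into a distance and then maps that distance onto a cross-fade amount. The conversion constant is the standard one for ultrasonic rangefinders (sound travels roughly 29 microseconds per centimetre, and the echo covers the round trip); the distance range is my own placeholder, not a value from the installation.

```cpp
// Convert an ultrasonic echo pulse duration (microseconds) to
// centimetres: ~29 us per cm, halved because the pulse travels
// to the object and back.
double pulseToCm(unsigned long usec) {
    return usec / 29.0 / 2.0;
}

// Map viewer distance to a cross-fade amount: fully on the first
// image when the viewer is at maxCm or farther, fully on the second
// at minCm or closer, with a linear blend in between.
// The 30-200 cm range is an assumed example, not a measured value.
double distanceToAlpha(double cm, double minCm = 30.0, double maxCm = 200.0) {
    if (cm <= minCm) return 1.0;
    if (cm >= maxCm) return 0.0;
    return (maxCm - cm) / (maxCm - minCm);
}
```

On the Arduino side the pulse duration would come from timing the sensor's echo pin; the resulting alpha is what the Pure Data patch uses to drive the dissolve.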


==Proof of Concept==


The media below demonstrate both artistic technique and the implementation of the installation concept. The first video documents physical interaction with the screen using my Pure Data patch. The second records an experiment with steam, shot at a high frame rate with a macro lens. The pictures are a preliminary mock-up of the installation made using Cinema 4D. An early sketch of the installation can be found below as well.


[[Media:Video Example 1.zip]]

[[Media:Video Example 2.zip]]


[[File:still 1.jpg]] [[File:Still 2.jpg]] [[File:Still 3.jpg]] [[File:Still 4.jpg]] [[File:Still 5.jpg]]
 
[[File:Installation sketch.jpg]]
 
==Next Steps==
 
Going forward, I will continue to conduct experiments using a fog machine, an oscillating ultrasonic humidifier, and a vacuum. Concurrently, I'm doing some research on cloud chambers. [http://www.nuffieldfoundation.org/sites/default/files/files/cloud_chamber.mov] As an aside, Dutch artist Berndnaut Smilde recently generated artificial nimbus clouds in an exhibition space by regulating its temperature and humidity. [http://www.washingtonpost.com/blogs/arts-post/post/artist-berndnaut-smilde-creates-indoor-clouds/2012/03/13/gIQA7yAT9R_blog.html] My ultimate goal is to control the density and movement of the fog based on the movement of the viewer in the installation space.

Latest revision as of 23:16, 29 July 2012