GMU:I and my Max/Elizabeth McTernan

I have chosen sound over other, higher-precision options such as lasers because I am interested in sound as a specifically Earth-bound medium: it cannot travel through the void of outer space, so it is explicitly terrestrial. This interests me when trying to (re)connect to the Earth through artistic practice, using sensors beyond my own body’s senses.


After making a series of 2D hypothetical sounding studies in ink on paper, I built my own sonar tool in the hope of extending those studies into the 3D space of the “real” world. For the Max/MSP aspect of the project, I mount the ultrasonic sensor on my hand and pass it over objects at a steady pace in a level line, which lets me focus on an object while leaving room for imperfection and gesture over a duration of time. To take readings, I move my hand back and forth like a scanner over the surface of whatever object I place beneath it. The device obtains each reading by emitting regular ultrasonic clicks that hit the surface of the object and bounce back to the sensor. The data collected during this process is sent into Max/MSP (via a jit.lcd object) and used to reconstitute the form of the object in digital space, as seen through sound. Below is my process over several months of developing a Max/MSP patch to this end, along with the resulting images.
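As a minimal sketch of this click-and-echo step, assuming an HC-SR04-type ultrasonic sensor on an Arduino-compatible board (the exact sensor, pin numbers, and timing here are assumptions rather than the documented build), the reading loop could look roughly like this, with each distance sent over serial so that Max/MSP can pick it up, for example through a [serial] object:

<syntaxhighlight lang="cpp">
// Hypothetical sketch: one ultrasonic "click" per loop, distance sent over serial.
// Pin numbers are placeholders, not taken from the actual device.

const int TRIG_PIN = 9;   // trigger pin: emits the ultrasonic pulse
const int ECHO_PIN = 10;  // echo pin: goes HIGH until the reflection returns

void setup() {
  pinMode(TRIG_PIN, OUTPUT);
  pinMode(ECHO_PIN, INPUT);
  Serial.begin(9600);     // Max/MSP can read this stream with a [serial] object
}

void loop() {
  // Emit a short ultrasonic click (10 µs trigger pulse).
  digitalWrite(TRIG_PIN, LOW);
  delayMicroseconds(2);
  digitalWrite(TRIG_PIN, HIGH);
  delayMicroseconds(10);
  digitalWrite(TRIG_PIN, LOW);

  // Time until the echo returns, in microseconds.
  long duration = pulseIn(ECHO_PIN, HIGH);

  // Sound travels roughly 0.0343 cm/µs; halve it because the pulse goes out and back.
  float distanceCm = duration * 0.0343f / 2.0f;

  Serial.println(distanceCm);  // one reading per line
  delay(50);                   // regular click rate, about twenty readings per second
}
</syntaxhighlight>

At that rate the stream is steady enough to trace a slow hand pass over an object.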
 
The device records the distance between the mounted sensor and the surface or object beneath it, so the readings can also feed into analogue drawings as well as the digital reconstruction. As test objects I constructed simple geometrical 3D shapes out of heavy paper and have “sounded” them with my machine; they are intentionally similar to the 2D shapes used in my preceding drawings.
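Conceptually, the reconstruction works like a scanner: each distance reading becomes the brightness (or height) of one point along a line, and stacking many passes builds up an image of the object. The following stand-alone sketch shows only that mapping idea in C++, with made-up readings and an assumed 0–40 cm working range; it is not the Max/MSP patch itself, which draws the values into a jit.lcd matrix instead.

<syntaxhighlight lang="cpp">
// Rough illustration of the mapping from one pass of distance readings to a row
// of grey values. The readings and the 0–40 cm range are example values only.

#include <algorithm>
#include <iostream>
#include <vector>

int main() {
    // One left-to-right pass of the hand-held sensor over a paper shape (cm).
    std::vector<float> readings = {30.0f, 29.5f, 22.0f, 14.0f, 13.8f,
                                   14.1f, 21.5f, 29.8f, 30.2f, 30.0f};

    const float minCm = 0.0f, maxCm = 40.0f;  // assumed working range of the sensor

    // Map each distance to a 0–255 grey value: nearer surfaces are drawn brighter,
    // so the object stands out from the table it rests on.
    for (float d : readings) {
        float clamped = std::clamp(d, minCm, maxCm);
        int grey = static_cast<int>(255.0f * (1.0f - (clamped - minCm) / (maxCm - minCm)));
        std::cout << grey << ' ';
    }
    std::cout << '\n';  // one scan line; stacking many lines yields an image of the form
    return 0;
}
</syntaxhighlight>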