1 FEBRUARY 2021

Final patch:

[[:File:210201_ultrasonic_drawing.maxpat]]

'''Project Summary'''
''Sounding Landshapes'' – digital drawings of objects as sensed with sound. The data are collected by scanning objects repeatedly over time with a self-devised ultrasonic sensor device.
Sounding is traditionally a technique by which a boat, outfitted with a sonar device, floats atop the surface of a body of water and measures its depths. Meanwhile, sonography is a method that uses sound to read and image surfaces, most commonly in medicine and geology. In this work, I am interested in sounding as the action and sonography in the sense of its literal etymological meaning, “drawing with sound.”
I have chosen sound over higher-precision options, such as lasers, because I am interested in sound as a specifically Earth-bound medium: sound cannot travel through the void of outer space, so it is explicitly terrestrial. This interests me when trying to (re)connect to the Earth through artistic practice, using sensors beyond my own body’s senses.
After making a series of 2D hypothetical sounding studies in ink on paper, I built my own sonar tool in the hope of extending those studies into the 3D space of the “real” world. For the Max/MSP aspect of the project, I mounted the ultrasonic sensor device on my hand and passed it over objects at a steady pace in a level line, so I could still make sense of the data while keeping room for imperfection and gesture over a duration of time. To take readings, I move my hand back and forth like a scanner over the surface of whatever object I place on the ground. The sensor obtains a reading by emitting regular ultrasonic clicks that hit the surface of the object and bounce back to the device. The data collected during this process are sent into Max/MSP and used to reconstitute the form of the object in digital space (via a jit.lcd object). The resulting image is the object as seen through sound. Below is my process over several months of developing a Max/MSP patch to this end, along with the resulting images.
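The arithmetic behind a single ultrasonic reading can be sketched as follows. This is an illustrative Python sketch, not the Max/MSP patch itself; the speed-of-sound constant and all function names are my own assumptions for the example:

```python
SPEED_OF_SOUND_M_S = 343.0  # approximate speed of sound in air at 20 °C (assumed)

def echo_to_distance_cm(round_trip_s: float) -> float:
    """Convert an ultrasonic echo round-trip time (seconds) to distance (cm).

    The click travels to the surface and back, so the one-way
    distance is half the round-trip time times the speed of sound.
    """
    return (round_trip_s * SPEED_OF_SOUND_M_S / 2.0) * 100.0

def distances_to_heights(distances_cm, sensor_height_cm):
    """Turn sensor-to-surface distances into surface heights above the floor.

    A closer reading means a taller point on the scanned object,
    which is what lets the stream of readings trace the object's profile.
    """
    return [sensor_height_cm - d for d in distances_cm]
```

A reading of one millisecond round-trip, for example, corresponds to roughly 17 cm between sensor and surface; subtracting each distance from the (level) hand height yields the profile that the drawing traces.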
25 JANUARY 2021
A new question has arisen: How do I make it so that every time the data input restarts (after having been stopped), it restarts at the far left of the window rather than in the middle? You can see in my video that when I restart the data input, the lines start from the middle of the window, not the far left. This makes things too messy when I want to start a fresh image.
<gallery> | |||
File:Screen Shot 2021-01-27 at 3.43.57 PM.png | |||
</gallery> | |||
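Outside of Max, the behavior I am after can be sketched as a horizontal pen position that is explicitly reset to zero whenever input restarts, instead of being left wherever the previous pass stopped. A minimal Python sketch of that idea (the class and method names are hypothetical, not part of the patch):

```python
class ScanPlotter:
    """Illustrative sketch of plotting incoming sensor readings
    left to right across a fixed-width canvas."""

    def __init__(self, width: int):
        self.width = width
        self.x = 0  # horizontal pen position, in pixels

    def reset(self):
        """Call when data input restarts, so the next line begins
        at the far left rather than mid-window."""
        self.x = 0

    def plot(self, reading: float):
        """Place the reading at the current x, then advance,
        wrapping back to the left edge at the right border."""
        point = (self.x, reading)
        self.x = (self.x + 1) % self.width
        return point
```

In patch terms, the equivalent would be forcing the counter that drives the drawing's x coordinate back to zero on each restart, rather than letting it resume from its last value.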
Below are my most recent ultrasonic scans of a stool.
[[File:Screen Shot 2021-01-20 at 23.49.23.png|400px]]