Sonification / Cymatics

Sonification

Using data extracted from digital images with the open-source Processing programming environment, I explore alternate aesthetic experiences of those images through sound.  Sonification in this manner offers opportunities for the montage of image and sound to develop interesting narratives and poetic juxtapositions that might be considered a form of Gene Youngblood’s synaesthetic cinema.

Find below descriptions of sonification instruments involving image translation and links to examples of them in use.

ImageFreq is an application created in Processing that allows the user to load images and use them to generate sine wave audio frequencies.  To create the frequencies, the visible spectrum is overlaid on the audible spectrum: the red, green, and blue values of each pixel generate frequencies that fall within three discrete ranges corresponding to their relative positions on the visible spectrum.  Red is lower in tone, whereas blue is very high pitched.

For each of four images, the interface allows the user to decide how frequently pixels are sampled, the relative volume of each image, and the range of the audible spectrum it uses.  Different images create different tones and melodies.  Each image can be sampled to provide bass, melody, or even percussion by using the pitch slider and the “hold” button (a.k.a. sustain).
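As an illustration of the underlying idea (not the actual ImageFreq source), here is a minimal Processing sketch that uses the Processing Sound library; the image filename, pixel-sampling stride, and frequency bands are hypothetical placeholders. It samples one pixel at a time and maps red to a low band, green to a middle band, and blue to a high band of sine frequencies.

import processing.sound.*;

PImage img;
SinOsc oscR, oscG, oscB;
int index = 0;      // current pixel
int stride = 100;   // how frequently pixels are sampled (placeholder value)

void setup() {
  size(400, 400);
  img = loadImage("example.jpg");   // hypothetical image file
  img.loadPixels();
  oscR = new SinOsc(this);
  oscG = new SinOsc(this);
  oscB = new SinOsc(this);
  oscR.play();
  oscG.play();
  oscB.play();
}

void draw() {
  image(img, 0, 0, width, height);
  color c = img.pixels[index];
  // overlay the visible spectrum on the audible one:
  // red -> low band, green -> middle band, blue -> high band (Hz ranges are placeholders)
  oscR.freq(map(red(c),   0, 255, 100,  400));
  oscG.freq(map(green(c), 0, 255, 400,  1600));
  oscB.freq(map(blue(c),  0, 255, 1600, 6400));
  index = (index + stride) % img.pixels.length;
}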

MidiSight developed out of ImageFreq and is still under development (as of July 2010).  Instead of generating sine waves, MidiSight uses MIDI messages to play instruments based on pixel color.  Again using the red, green, and blue color channels, three separate instruments may be played.  Sliders below the single image allow you to select the instrument used for each channel and how quickly it plays each note.
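The sketch below is a rough illustration of the same idea, not MidiSight itself; it uses Java’s built-in javax.sound.midi synthesizer (the actual application may rely on a different MIDI library), and the image filename, note interval, instrument choices, and pitch range are hypothetical. Each color channel drives its own instrument on its own MIDI channel.

import javax.sound.midi.*;

PImage img;
MidiChannel[] channels;
int index = 0;
int noteInterval = 250;   // ms between notes (the "speed" slider; placeholder value)
int lastNoteTime = 0;

void setup() {
  size(400, 400);
  img = loadImage("example.jpg");   // hypothetical image file
  img.loadPixels();
  try {
    Synthesizer synth = MidiSystem.getSynthesizer();
    synth.open();
    channels = synth.getChannels();
    // one General MIDI instrument per color channel (program numbers are placeholders)
    channels[0].programChange(0);    // red   -> acoustic grand piano
    channels[1].programChange(32);   // green -> acoustic bass
    channels[2].programChange(40);   // blue  -> violin
  } catch (MidiUnavailableException e) {
    e.printStackTrace();
  }
}

void draw() {
  image(img, 0, 0, width, height);
  if (millis() - lastNoteTime > noteInterval) {
    color c = img.pixels[index];
    for (int i = 0; i < 3; i++) channels[i].allNotesOff();
    // map each channel's 0-255 value to a MIDI pitch in a roughly two-octave range
    channels[0].noteOn(int(map(red(c),   0, 255, 36, 84)), 100);
    channels[1].noteOn(int(map(green(c), 0, 255, 36, 84)), 100);
    channels[2].noteOn(int(map(blue(c),  0, 255, 36, 84)), 100);
    index = (index + 1) % img.pixels.length;
    lastNoteTime = millis();
  }
}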


Electronic Cymatics

The videos that follow are examples of electronic cymatics. Electronic cymatics is a process by which sound signals are used as control voltages to create abstract imagery through analog video processing, as opposed to traditional cymatics, which generates visuals from the physical vibration of sound.

The videos presented here were generated live as part of a residency at the Experimental Television Center in Owego, NY in June 2010. I am currently developing methods to incorporate equivalent imagery into live performances using a circuit-bent Casio SK-1 sampler and a consumer-grade video mixer. Stay tuned for more videos of this process as it develops.

Mixing my interests in electronic cymatics and video appropriation, I have also created a new improvised audio AND visual score for the movie Transformers 2.

Electronic Cymatic Experiment no.3 from Lee Montgomery on Vimeo.

Electronic Cymatic Experiment No. 2 from Lee Montgomery on Vimeo.

Electronic Cymatic Experiment No.1 from Lee Montgomery on Vimeo.