
bark rubbings: city as forest

a site-specific, mobile, visuo-tactile installation



In bark rubbings, I have imagined the city as a forest of textures. A walk through a forest is an immersive experience of living architecture, a medium that forges connections across senses and scales, composed of varying patterns and densities of material life.

It is possible to view the city in a similar fashion. Certain areas are more or less dense with buildings and pedestrians. The buildings' canopy is higher or lower depending upon use or neighborhood. Each individual facade has its own unique texture. For bark rubbings, images of the city's architectural facades have been converted to sound and then optimized for vibrotactile feedback. Wearing the vest and walking through the city, one can experience the visual textures of these built structures, projected out as vibrotactile patterns into the spaces that surround them.



video

bark rubbings: a locative / wearable / tactile experience from Erik Conrad on Vimeo.

news

bark rubbings: city as forest was exhibited as part of PIXILERATIONS [v.6], September 24-October 11, 2009. Pixilerations is a showcase of new media art and interactive performance, and part of the FirstWorks Festival 2009 in Providence, Rhode Island.



locations

bark rubbings took place in downtown Providence, RI, starting and ending at RISD's Sol Koffler Gallery.



how it works

The system comprises the Wealthy Rechargeable 8-Motor Massage Vest Massager coupled with an HP iPaq 5900 running mscape, plus some custom hardware based on this circuit from afrotechmods that transduces sound into vibration. The circuit was modified slightly to suit the needs of the project, for example to allow stereo input so that there are two channels of vibration.

To create the patterns of vibration, I first took snapshots of prominent (or visuo-tactilely interesting) buildings around downtown Providence, such as City Hall in Kennedy Plaza:


Then, I ran the images through some patches (software) that I wrote in Max/MSP/Jitter. The first patch adjusts the brightness and contrast and samples a row of pixels from the image to create a simplified abstract representation:


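The original patch was built in Max/MSP/Jitter; purely as an illustration of the idea, a minimal Python sketch of the same step might look like the following. The file name, row choice, and brightness/contrast values are placeholders, not the settings used in the piece.

# Rough sketch of the first patch: adjust brightness and contrast, then sample one
# row of pixels as a simplified abstract representation of the facade.
# All values here are placeholder assumptions; the real work was a Max/MSP/Jitter patch.
from PIL import Image, ImageEnhance

def sample_row(path, row=None, brightness=1.2, contrast=2.0):
    img = Image.open(path).convert("L")                   # grayscale
    img = ImageEnhance.Brightness(img).enhance(brightness)
    img = ImageEnhance.Contrast(img).enhance(contrast)
    if row is None:
        row = img.height // 2                             # default: middle of the image
    return [img.getpixel((x, row)) for x in range(img.width)]

row_values = sample_row("city_hall.jpg")                  # hypothetical snapshot file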
The second step takes the modified image (above) and translates it into a loop of frequencies that will be used to drive the motors.

In between the blips and boops that one can hear is a range of sub-audible frequencies suited to driving the vibration motors.
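Again as an illustration only, here is a rough Python sketch of this image-to-frequency translation, assuming a simple linear mapping from pixel brightness to tone frequency. The 10-250 Hz range and 50 ms per pixel are invented placeholders, not the values used in the piece.

# Rough sketch of the second step: map each sampled pixel to a short sine tone,
# dipping into the low/sub-audible range that suits the vibration motors,
# and write the result out as a loopable WAV file.
import math, struct, wave

def row_to_wav(row_values, path="facade_loop.wav", rate=44100,
               step_s=0.05, f_low=10.0, f_high=250.0):
    frames = bytearray()
    for v in row_values:
        freq = f_low + (v / 255.0) * (f_high - f_low)     # brightness -> frequency
        for n in range(int(rate * step_s)):
            s = int(32767 * 0.8 * math.sin(2 * math.pi * freq * n / rate))
            frames += struct.pack("<h", s)
    with wave.open(path, "w") as w:
        w.setnchannels(1)
        w.setsampwidth(2)                                 # 16-bit samples
        w.setframerate(rate)
        w.writeframes(bytes(frames))

# e.g. row_to_wav(row_values), using the row sampled in the previous sketch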

The abstracted images and the mp3s generated from them are then geolocated with mscape to play back/display when the participant walks through the corresponding area. While the images are displayed on the screen of the iPaq, the sounds are not heard aloud. They are sent via a stereo-mini cable to the electronics in the vest, which convert the sound into two channels of variable vibration, allowing the participant to feel the tactile variations of the facades on their body.
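mscape handled the geolocated triggering on the iPaq. Purely to illustrate the logic, here is a simplified Python sketch of that kind of geofenced playback; the coordinates, radius, and file name are invented placeholders, and the actual image display and audio routing were done by mscape on the device.

# Simplified sketch of geofenced playback of the kind mscape provided on the iPaq.
# Coordinates, radius, and file name below are invented placeholders.
import math

REGIONS = [
    # (label, latitude, longitude, radius in meters, audio loop)
    ("city_hall", 41.8246, -71.4128, 40.0, "city_hall_loop.mp3"),
]

def meters_between(lat1, lon1, lat2, lon2):
    # Haversine distance; accurate enough at city-block scale
    r = 6371000.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp, dl = math.radians(lat2 - lat1), math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def active_region(lat, lon):
    for label, rlat, rlon, radius, clip in REGIONS:
        if meters_between(lat, lon, rlat, rlon) <= radius:
            return label, clip        # in the piece: display the image, play the loop
    return None

print(active_region(41.8246, -71.4128))   # -> ('city_hall', 'city_hall_loop.mp3')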

background

Please refer to palpable city and my Tactilist Theatre for more information about the aesthetics of touch.




How to display a flying dragon, from Johann Kestler, Physiologia Kircheriana Experimentalis, p. 247 (kircher.stanford.edu/gallery).

© Erik Conrad 1998-2009