The Light Arrays project explores extending the body through an array of visible light beams that project onto the environment a dynamic representation of the body, its movement, and its posture. Interestingly, these light cues are visible both to the user wearing the device and to others. This feature points to two lines of research:
- Augmented Proprioception generated with an artificial visual feedback system. This can be useful for learning complex somatic techniques, speeding up rehabilitation, and exploring the body’s expressive capabilities.
- Enhanced body interaction prompted by an interactively augmented body image (in time and space), as well as a clear visual representation of interpersonal space.
This system complements – and to a certain extent functions as the exact reverse of – the Haptic Radar system, in which rangefinders were used to extend spatial awareness through vibrotactile feedback. Indeed, rather than gathering information about the objects surrounding the wearer and transducing it into tactile cues, the Light Arrays system gathers information about the wearer’s posture and projects it onto the surroundings for everybody to observe.
We are exploring several embodiments of the Light Arrays using laser modules, servo motors, and sensors (either worn or external). Both the direction and the intensity of the laser beams change according to the motion of the wearer, or in response to the motion of a second person. This creates an interesting interaction scenario in which the extended body may be shared between two people.

In the in-visible skirt prototype shown in the figures, each of the 12 laser modules (635nm, 3mW) attaches to a flexible circular support that can be deformed and rotated by a set of four servo motors. A microcontroller (ATmega168) maps sensor data coming from a second wearable “controller” onto meaningful servo positions. An elementary mapping demonstrated in this video shows forward/backward or left/right bending postures reproduced as similar motions of the light-based skirt. Three separate battery sources drive the servos, the lasers, and the microcontroller. Data is sent wirelessly over an XBee ZNet 2.5 network capable of transmitting raw data at a rate of 30Hz, or coded commands at a lower speed. Sensor data is also sent to an external desktop computer, which helps in designing interesting new mappings and analysing the data.
- Video of the performance and installation at the Yebisu International Festival of Art and Alternative Visions (18-27/2/2011): [WMV, 64MB]
- Video demo of the “in-visible skirt” prototype (first tests): [WMV, 33MB], [MP4, 8MB]
- In-visible skirt prototype, details of laser motion: [MOV, 60MB]
- Wilde, D., Cassinelli, A., Zerroug, A. Light Arrays. ACM CHI’12 (Interactivity), May 5-10, 2012, Austin, Texas, USA (2012) [PDF, 4.2MB] [WMV, 46MB]
- Wilde, D., Cassinelli, A., Zerroug, A., Helmer, R. J. N., Ishikawa, M. Light Arrays: a system for extended engagement. Proc. ICDVRAT with ArtAbilitation, Viña del Mar/Valparaíso, Chile, September 2010 [PDF, 2MB]
- Yebisu International Festival of Art and Alternative Visions (18-27/2/2011) [website] [Blog], Panel presentation [PDF, 37MB]