MusieGlove & Body 2.0

 

Abstract: MusieGlove and Body 2.0 form an integrated system that allows dancers to create motion-based generative music. The project was designed to give the user full control and to serve the choreographer’s ideas and creative process. A fully functioning system with the core design aspects was developed, and a public performance employing it was given.

Classification: Interaction Design, Experience Design, Hardware and Software Prototyping


Context and Research Question

The project originated from conversations with dancers I have worked with over the years. The question that naturally arose was the following: is it possible to design a system that puts the dancer in control of the music?

The inquiry was motivated by the fact that choreography, considered as an artistic institution, operates, and has always operated, by treating the musical medium as fixed: choices regarding the movements of the body must be made to follow this medium.

The ubiquity of sensing technologies made this practice seem antiquated to our eyes, and therefore ripe for innovation. The idea was thus to put the body in a position of full control, treating the music as a ‘fluid’ parameter controlled via its movements.

There have been incredible efforts in the past 15-20 years to incorporate various interactive technologies to ‘embellish’ the visual aspects of a production. Little effort and thought, however, has been given to developing a system for an interactive musical work properly integrated into a dance production.

Within a couple of months a fully functioning system was developed and a public performance employing it was given. The project was prototyped quickly with technologies that did not offer optimal solutions in terms of accuracy and performance, and should therefore be considered a fully functioning proof of concept (i.e. not market-ready).

General System Design

The system developed makes use of the following elements:

  1. A custom-built wearable device for touch controls and detailed hand-movement sensing;
  2. Infrared sensing for detecting overall body and limb movements;
  3. Software in charge of all sonic mappings and other performance controls.

Interaction Design

With the aforementioned constituents, the system was designed around a physical modality of interaction based on two sub-modalities: tactile and motion-based. Both are mapped to auditory commands: the tactile sub-modality controls detailed sonic constituents, while the motion-based one controls the overall sonic structures.

Body-slide1.jpg

Implementation I: The MusieGlove

In order to meet non-trivial choreographic needs, some trigger-like controls and detailed hand-motion tracking seemed inevitable. This consideration led to the design decision to develop a wearable device for exactly that purpose.

The hardware was prototyped with an Arduino board, which allowed for a simple and quick implementation process: it included four push-button switches, a MEMS gyroscope and accelerometer (placed on the wrist), and a wireless module, all assembled on top of a black polyester glove.
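
As a rough idea of what such firmware involves, here is a minimal Arduino sketch; the pin assignments, the analog IMU readings, and the message format are illustrative assumptions, not the original code.

```cpp
// Hypothetical glove firmware sketch: pins, wiring and frame format are
// illustrative only, not the original implementation.
const int BUTTON_PINS[4] = {2, 3, 4, 5};   // one push button per finger

void setup() {
  for (int i = 0; i < 4; i++) pinMode(BUTTON_PINS[i], INPUT_PULLUP);
  Serial.begin(57600);  // assuming the wireless module bridges the serial port
}

void loop() {
  // Pack the four button states into a single bitmask.
  int buttons = 0;
  for (int i = 0; i < 4; i++)
    if (digitalRead(BUTTON_PINS[i]) == LOW) buttons |= (1 << i);

  // Stand-in for the MEMS readings; a digital IMU would instead be polled
  // over I2C (e.g. with the Wire library).
  int ax = analogRead(A0), ay = analogRead(A1), az = analogRead(A2);

  // Emit one comma-separated frame per cycle, e.g. "G,3,512,498,610".
  Serial.print("G,"); Serial.print(buttons); Serial.print(',');
  Serial.print(ax);   Serial.print(',');    Serial.print(ay);
  Serial.print(',');  Serial.println(az);

  delay(20);  // roughly 50 Hz update rate
}
```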

 

Here is a clip of the device being tested after it was built. Data was sent to custom-built software developed in the visual programming language Max/MSP.

 

As can be seen in the video, the four buttons are easily triggered via a pinching motion with the thumb.
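
On the receiving side, the Max/MSP patch routes these values to the sound engine; purely as an illustration of the parsing step, an equivalent host-side reader for the hypothetical frame format above could look like this in C++ (the actual project handled the stream inside Max, e.g. with its serial object):

```cpp
// Illustrative parser for the hypothetical "G,buttons,ax,ay,az" frames;
// the real project parsed the stream inside a Max/MSP patch instead.
#include <iostream>
#include <sstream>
#include <string>

int main() {
    std::string line;
    while (std::getline(std::cin, line)) {           // one frame per line
        std::istringstream ss(line);
        std::string tag;
        int buttons, ax, ay, az;
        char sep;
        if (std::getline(ss, tag, ',') && tag == "G" &&
            ss >> buttons >> sep >> ax >> sep >> ay >> sep >> az) {
            // Each field would then be routed to its own mapping,
            // much like Max's [route] object does.
            std::cout << "buttons=" << buttons
                      << " accel=(" << ax << ',' << ay << ',' << az << ")\n";
        }
    }
}
```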

Implementation II: Infrared Sensing and Software

A Microsoft Kinect was employed to detect the user’s limb motions: the overall body position on stage and the left and right arms and legs are tracked.
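
As a sketch of how skeleton data can become a musical control, the following assumes joint positions have already been extracted (for instance via the Kinect SDK or an OSC skeleton bridge); the joint structure, ranges, and mapping here are illustrative assumptions.

```cpp
// Minimal mapping sketch: turn a tracked hand's height relative to the
// torso into a normalized 0..1 control value. All values are illustrative.
#include <algorithm>
#include <iostream>

struct Joint { float x, y, z; };  // metres, camera space

float handHeightControl(const Joint& hand, const Joint& torso) {
    const float range = 0.8f;  // assumed usable reach above the torso, in m
    float v = (hand.y - torso.y) / range;
    return std::clamp(v, 0.0f, 1.0f);  // requires C++17
}

int main() {
    Joint torso{0.0f, 1.0f, 2.5f}, rightHand{0.3f, 1.6f, 2.3f};
    // Prints 0.75, which might drive e.g. a filter cutoff for one layer.
    std::cout << handHeightControl(rightHand, torso) << '\n';
}
```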

The combined data input, from both the glove and the Kinect, is sent to software prototyped in Max/MSP. Its main architectural elements are the following (a structural sketch follows the list):

  1. One push button is mapped to the fixed function of starting and transitioning between ‘sonic spaces’;
  2. A ‘sonic space’ is initialized by playing a collection of different audio files simultaneously;
  3. Each file is controlled by a different variable of the data stream (limbs, wrist, push buttons) via a dedicated DSP chain (principally filter, volume, and panning);
  4. Audio files controlled by limb motion and by the overall body center have very long transients (the audio may continue indefinitely if no transition is triggered); the push buttons control short-transient ‘sound effects’;
  5. Each sonic space is time-dependent and evolves according to an elapsed-time variable.
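
The C++ sketch below captures this architecture in miniature; every name in it is hypothetical, since the real implementation is a Max/MSP patch, but it shows the intended shape: each layer binds one input variable to one DSP parameter, while an elapsed-time variable evolves the space.

```cpp
// Architectural sketch only: a 'sonic space' as a set of audio layers, each
// binding one input variable to one DSP parameter. Names are hypothetical.
#include <functional>
#include <string>
#include <vector>

struct Layer {
    std::string file;                     // audio file backing this layer
    std::function<float()> input;         // limb, wrist or button source
    std::function<void(float)> dspParam;  // filter cutoff, volume, pan...
};

struct SonicSpace {
    std::vector<Layer> layers;
    float elapsed = 0.0f;                 // drives time-dependent evolution

    void tick(float dt) {
        elapsed += dt;
        for (auto& l : layers)
            l.dspParam(l.input());        // continuous input -> DSP mapping
    }
    // One reserved button would call a transition function to swap in the
    // next SonicSpace, cf. point 1 above.
};

int main() {
    float cutoff = 0.0f;
    SonicSpace space;
    space.layers.push_back({"pad.wav",
        [] { return 0.6f; },              // stand-in for a limb position
        [&](float v) { cutoff = v; }});   // stand-in for a filter control
    space.tick(0.02f);                    // one 20 ms control frame
}
```
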
Body-slide3.jpg

Performance Result and Future Possibilities

A choreography involving a mix of structured movement and improvisation was adopted for the public performance. The implemented system gave the dancer full control during her creative process.

The movements chosen depended heavily on the control she had over each sonic space. This suggests providing, on the software side, a fully customizable interface that fits the dancer’s wants, along with a large selection of musical controls and templates.

With such implementations and freedoms for the end consumer, a market-ready product could be commercially viable, as I found through preliminary market research that this space remains virtually unexplored.