Robotics laboratory to reactive dance space

I was invited to Örebro University in Sweden for a two-day lab working on a dance and motion project as part of the Music Tech Fest (MTF – https://musictechfest.net). This was the performance piece we created.

Concept

Exploring how the opacity of technology affects our human-to-human communication. The piece touches on our assumption that we are in control of invisible technology, on how that technology inserts its own agendas into our communication, and on how challenging the boundaries of the black box's visibility can empower our decisions and usage.

Two dancers move around the space, exploring how to control audio through movement. They develop a form of communication, using their movements and proximity to speak to each other. In the centre of the performance space sits a black box that is a dead zone for all sensors: it is the only place in the room where no sensor readings exist. Within the black box, two people control how the sensor data gets turned into sound, a hidden layer controlling the conversation. The dancers start to realise the presence and influence of the black box in their communication, and decide to enter it and take control of the technology generating that communication. At this point the entire audience is encouraged into the space to move and explore the technology in the black box, now controlled by the dancers.

Materials: Dancers, Motion Capture, Gyroscope and accelerometer sensors, Speakers, Software & Live Coding.

Project details

We used a motion capture rig to capture the number of people in the space, their locations and their proximity to each other. The dancers also wore accelerometer sensors on their arms providing absolute position information. The audio controlled through movement came from the Stanford DAMP Karaoke dataset (https://ccrma.stanford.edu/damp/), primarily used to study pronunciation.
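As an illustration, the proximity reading described above could be derived from tracked positions roughly like this. This is a minimal Ruby sketch, not the actual rig's code: the function names, coordinates and the assumed room size are all hypothetical.

```ruby
# Sketch: derive a pairwise "closeness" value from tracked 2D positions.
# Names, coordinates and the 8-metre normalisation range are hypothetical
# examples, not the actual motion-capture output.

def distance(a, b)
  Math.sqrt((a[0] - b[0])**2 + (a[1] - b[1])**2)
end

# Map a distance to a 0.0..1.0 closeness value (1.0 = touching),
# assuming the performance space is roughly 8 metres across.
def closeness(a, b, max_distance = 8.0)
  d = distance(a, b)
  [1.0 - (d / max_distance), 0.0].max
end

dancer_a = [1.0, 2.0]  # x, y in metres
dancer_b = [4.0, 6.0]

puts closeness(dancer_a, dancer_b)  # distance 5.0 -> closeness 0.375
```

A value like this can then be mapped onto any sound parameter, for example sample volume or filter cutoff, so that the dancers hear how close they are to each other.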

All sensors fed into a router and then on to two computers inside the "Black Box", where Andreas and I sat turning the sensor data into sound. Andreas was using Csound and I was using Sonic Pi, live-coding with the sensor data and samples. The black-box area was created by precisely mapping out a region of the motion-capture space from which data was never sent, creating a dead zone for all sensors.
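The dead-zone idea can be sketched as a simple spatial filter: any reading whose tracked position falls inside the mapped rectangle is dropped before it reaches the sound software. Again a hypothetical Ruby sketch; the box coordinates are invented, since the real area was mapped inside the motion-capture system itself.

```ruby
# Sketch of the dead zone: suppress any sensor reading whose tracked
# position falls inside the black-box area. Coordinates are hypothetical.

BLACK_BOX = { x: 3.0, y: 3.0, w: 2.0, h: 2.0 }  # metres

def in_black_box?(x, y, box = BLACK_BOX)
  x >= box[:x] && x <= box[:x] + box[:w] &&
    y >= box[:y] && y <= box[:y] + box[:h]
end

# Only forward readings taken outside the box.
readings = [
  { id: :dancer_a, x: 1.0, y: 1.0 },
  { id: :dancer_b, x: 4.0, y: 3.5 },  # inside the box, so suppressed
]
forwarded = readings.reject { |r| in_black_box?(r[:x], r[:y]) }
puts forwarded.map { |r| r[:id] }.inspect  # [:dancer_a]
```

The performers entering the box therefore vanish from the data stream entirely, which is what lets the box read as a place outside the system's gaze.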

The project team was:

  • Andreas Bergsland / Norwegian University of Science and Technology / Norway
  • Joseph Wilk / South West Creative Technology Automation fellow / UK
  • Kirsi Mustalahti / ACCAC Global / Finland
  • Lilian Jap / Software developer / Sweden
  • Students and teachers of the Robotics Lab Örebro