My capture system consists of a Microsoft Kinect v2 and a Max patch that records OSC data streaming from the Kinect.
For development purposes, climber skeleton articulation data was captured to a text file. This replayable articulation data functions as a mock climber input source, enabling synth engine development without a climber present.
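The replay idea above can be sketched in a few lines. This is a hypothetical sketch, not the actual Max patch or capture format: it assumes each logged line holds a timestamp, an OSC address, and three joint coordinates, and it re-emits messages through a caller-supplied `send` callback so the same code works for testing or for a real OSC client.

```python
import time
from typing import Iterator, NamedTuple


class JointSample(NamedTuple):
    timestamp: float  # seconds since recording start (assumed log field)
    address: str      # OSC address, e.g. "/skeleton/handLeft" (assumed naming)
    coords: tuple     # (x, y, z) joint position in Kinect camera space


def read_capture(path: str) -> Iterator[JointSample]:
    """Parse one logged OSC message per line: '<t> <address> <x> <y> <z>'."""
    with open(path) as f:
        for line in f:
            parts = line.split()
            if len(parts) != 5:
                continue  # skip malformed or blank lines
            t, addr = float(parts[0]), parts[1]
            x, y, z = (float(v) for v in parts[2:5])
            yield JointSample(t, addr, (x, y, z))


def replay(samples, send, speed=1.0):
    """Re-emit samples with original timing via a send(address, coords) callback."""
    prev = None
    for s in samples:
        if prev is not None:
            # Sleep for the gap between consecutive samples, scaled by speed.
            time.sleep(max(0.0, (s.timestamp - prev) / speed))
        prev = s.timestamp
        send(s.address, s.coords)
```

In use, `send` could simply wrap a UDP OSC client pointed at the synth engine's input port, so the engine cannot tell a replayed session from a live climber.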
Once the synth engine is developed, Climber Synth will offer real-time sonic feedback to climbers performing on the tread wall or on any climbing route.
1. Develop synth engine based on skeleton articulation data
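One plausible starting point for the synth engine is a direct mapping from joint motion to control parameters. The sketch below is an illustrative assumption, not the project's actual mapping: it maps a hand joint's height to pitch and its movement speed to amplitude, with all ranges (two octaves above MIDI note 48, full volume at 2 m/s) chosen arbitrarily for the example.

```python
import math


def map_to_synth(prev, curr, dt):
    """Map one joint's motion to (frequency_hz, amplitude) control values.

    prev, curr: (x, y, z) joint positions in metres; dt: seconds between them.
    Hypothetical mapping: height above the wall base -> pitch,
    movement speed -> amplitude.
    """
    speed = math.dist(prev, curr) / dt if dt > 0 else 0.0
    # Clamp height (metres) into a two-octave range above MIDI note 48.
    height = min(max(curr[1], 0.0), 2.0)
    midi_note = 48 + (height / 2.0) * 24
    freq_hz = 440.0 * 2 ** ((midi_note - 69) / 12)
    amp = min(speed / 2.0, 1.0)  # full volume at 2 m/s
    return freq_hz, amp
```

These values could be sent back into Max as control messages, so a static hold falls silent while a dynamic reach rises in both pitch and loudness.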
Pertinent questions for research:
1. How could sonic feedback facilitate performance on the climbing wall? What kinds of performative movement might it inspire?
2. How could sonic feedback provide real-time “coaching” for a climber’s technical form?