Dancing with Robots
Overview
“I didn’t want to imitate anybody. Any movement I knew, I didn’t want to use.” Eminent modern dance choreographer Pina Bausch felt the same ache that has pierced artists of all generations – the desire to generate something truly original, yet still somehow sourced from one’s own body and mind.
Recent technologies for capturing human motion, together with machine learning methods for analyzing and predicting stylistically related kinetic sequences, have opened provocative new possibilities in movement generation. This project fuses expertise in machine learning, dance practice and choreographic theory, experimental particle physics, and the humanities to create open-source tools for multifaceted applications in dance and movement research.
Unlike other generative dance models based on Recurrent Neural Networks (RNNs) introduced in recent publications, our models are designed as beginner-friendly, open-source software resources for dance artists who wish to incorporate machine learning into their practice. Generative Adversarial Networks (GANs) have recently shown compelling results in image style transfer when prompted with a pose, but until now have not been used to generate sequences of original movements.
In this project, we introduce a suite of configurable, publicly accessible tools for generating dance using both RNNs and GANs. These methods were developed using improvisational dance performed by one of the authors, recorded with a state-of-the-art motion capture system that represents the human form with a rich density of datapoints. With this toolset, we equip dance artists with strategies for tackling the challenge Bausch faced in her own work: generating truly novel movements with both structure and aesthetic meaning.
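The RNN side of this approach can be sketched in miniature: a recurrent cell consumes the current pose vector, updates its hidden state, and emits a predicted next pose, which is fed back in to roll out a movement sequence frame by frame. The pose dimensionality, hidden size, and untrained random weights below are illustrative assumptions for exposition, not the authors' actual trained architecture.

```python
import numpy as np

# Hypothetical pose representation: 53 motion-capture markers x 3 coordinates.
POSE_DIM = 53 * 3
HIDDEN_DIM = 128

rng = np.random.default_rng(0)

# Small random weights stand in for a trained vanilla RNN cell.
W_xh = rng.normal(0, 0.01, (HIDDEN_DIM, POSE_DIM))   # input -> hidden
W_hh = rng.normal(0, 0.01, (HIDDEN_DIM, HIDDEN_DIM)) # hidden -> hidden
W_hy = rng.normal(0, 0.01, (POSE_DIM, HIDDEN_DIM))   # hidden -> next pose

def step(pose, h):
    """One RNN step: update the hidden state, predict the next pose."""
    h = np.tanh(W_xh @ pose + W_hh @ h)
    return W_hy @ h, h

def generate(seed_pose, n_frames):
    """Autoregressively roll out n_frames of motion from a seed pose."""
    h = np.zeros(HIDDEN_DIM)
    pose, frames = seed_pose, []
    for _ in range(n_frames):
        pose, h = step(pose, h)
        frames.append(pose)
    return np.stack(frames)

sequence = generate(rng.normal(size=POSE_DIM), n_frames=60)
print(sequence.shape)  # (60, 159): 60 generated frames of 159-dimensional poses
```

In practice the cell would be a trained LSTM or GRU and the seed would be a recorded phrase rather than noise, but the autoregressive loop is the core mechanism: each generated pose becomes the input for the next.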