Puppeteering an AI - Interactive Control of a Machine-Learning based Artificial Dancer
This paper describes the authors' first experiments in creating an artificial dancer whose movements are generated through a combination of algorithmic and interactive techniques with machine learning. This approach is inspired by the time-honoured practice of puppeteering. In puppeteering, an articulated but inanimate object seemingly comes to life through the combined effects of a human controlling select limbs of a puppet while the rest of the puppet's body moves according to gravity and mechanics. In the approach described here, the puppet is a machine-learning-based artificial character that has been trained on motion capture recordings of a human dancer. A single limb of this character is controlled either manually or algorithmically, while the machine-learning system takes over the role of physics in controlling the remainder of the character's body. Rather than imitating physics, however, the machine-learning system generates body movements that are reminiscent of the particular style and technique of the dancer who was originally recorded to acquire the training data. More specifically, the machine-learning system operates by searching for body movements that are not only similar to the training material but that it also considers compatible with the externally controlled limb. As a result, the character playing the role of a puppet is no longer passively responding to the puppeteer but makes movement decisions on its own. This form of puppeteering establishes a dialogue between puppeteer and puppet in which both improvise together, and in which the puppet exhibits some of the creative idiosyncrasies of the original human dancer.
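The abstract's core idea of "searching for body movements that are similar to the training material but compatible with the externally controlled limb" could be realised in several ways; the paper's abstract does not specify the model. The sketch below is a minimal, hypothetical illustration assuming a pose decoder trained on the dancer's motion-capture data and a gradient search in its latent space; all names (PoseDecoder, CONTROLLED_JOINT, the loss weights) are illustrative and not taken from the paper.

```python
# Hypothetical sketch: drive one joint from the puppeteer's input and let a
# learned decoder fill in the rest of the body. Assumes a decoder trained on
# motion-capture poses of the dancer; names and dimensions are made up.
import torch

LATENT_DIM = 32          # assumed size of the learned pose representation
NUM_JOINTS = 23          # assumed skeleton size of the mocap recordings
CONTROLLED_JOINT = 7     # index of the limb driven by the puppeteer

class PoseDecoder(torch.nn.Module):
    """Stand-in for a decoder trained on the dancer's motion-capture data."""
    def __init__(self):
        super().__init__()
        self.net = torch.nn.Sequential(
            torch.nn.Linear(LATENT_DIM, 128), torch.nn.ReLU(),
            torch.nn.Linear(128, NUM_JOINTS * 3),
        )

    def forward(self, z):
        # Returns joint positions (x, y, z) for the full body.
        return self.net(z).view(-1, NUM_JOINTS, 3)

def puppeteer_step(decoder, z, target_pos, steps=20, lr=0.05, prior_weight=0.1):
    """Search the latent space for a full-body pose whose controlled joint
    follows the puppeteer while the latent code stays near the training prior."""
    z = z.clone().requires_grad_(True)
    opt = torch.optim.Adam([z], lr=lr)
    for _ in range(steps):
        pose = decoder(z)
        follow = torch.nn.functional.mse_loss(pose[:, CONTROLLED_JOINT], target_pos)
        prior = z.pow(2).mean()  # keeps the pose close to the learned material
        loss = follow + prior_weight * prior
        opt.zero_grad()
        loss.backward()
        opt.step()
    return z.detach(), decoder(z).detach()

# Usage: advance the character frame by frame from an external target position.
decoder = PoseDecoder()                   # in practice: load trained weights
z = torch.zeros(1, LATENT_DIM)
target = torch.tensor([[0.3, 1.2, 0.1]])  # puppeteer-controlled limb position
z, full_body_pose = puppeteer_step(decoder, z, target)
```

In this reading, the "compatibility" constraint is the follow term and the "similarity to the training material" is the prior term; the actual system described in the paper may use a different architecture or search strategy.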
| Document Type: | Conference Proceeding |
|---|---|
| Conference Type: | Conference article |
| Citation Link: | https://opus.hs-offenburg.de/8628 |
| Title (English): | Puppeteering an AI - Interactive Control of a Machine-Learning based Artificial Dancer |
| Conference: | Generative Art Conference (24. : 15th-17th of December 2021 : Cagliari, Sardinia, Italy/online) |
| Author: | Daniel Bisig, Ephraim Wegner |
| Year of Publication: | 2021 |
| Place of Publication: | Rome |
| Publisher: | Domus Argenia Publisher |
| First Page: | 315 |
| Last Page: | 332 |
| Parent Title (English): | XXIV Generative Art 2021 : proceedings of XXIV GA conference |
| Editor: | Celestino Soddu, Enrica Colabella |
| ISBN: | 978-88-96610-43-5 |
| URL: | http://www.artscience-ebookshop.com/GA2021_proceedings_web.pdf |
| Language: | English |
| Institutes: | Fakultät Medien (M) (since 22.04.2021) |
| Tag: | artificial dancer; dance and technology; deep learning; motion synthesis |
| Open Access: | Open Access (Bronze) |
| Licence: | Creative Commons - CC BY-NC-SA - Attribution - NonCommercial - ShareAlike 4.0 International |