
Puppeteering AI - Interactive Control of an Artificial Dancer

  • Generative machine learning models for creative purposes play an increasingly prominent role in the field of dance and technology. A particularly popular approach is the use of such models for generating synthetic motions. Such motions can either serve as a source of ideation for choreographers or control an artificial dancer that acts as an improvisation partner for human dancers. Several examples employ autoencoder-based deep-learning architectures that have been trained on motion capture recordings of human dancers. Synthetic motions are then generated by navigating the autoencoder's latent space. This paper proposes an alternative approach of using an autoencoder for creating synthetic motions. This approach controls the generation of synthetic motions on the level of the motion itself rather than its encoding. Two different methods are presented that follow this principle. Both methods are based on the interactive control of a single joint of an artificial dancer while the other joints remain under the control of the autoencoder. The first method combines the control of the orientation of a joint with iterative autoencoding. The second method combines the control of the target position of a joint with forward kinematics and the application of latent difference vectors. As an illustrative example of an artistic application, this latter method is used for an artificial dancer that plays a digital instrument. The paper presents the implementation of these two methods and provides some preliminary results.
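The two methods summarized above can be sketched in a few lines. The following is a minimal illustration only, using a toy linear autoencoder as a stand-in for the trained deep model; all names, the joint layout, and the iteration scheme are assumptions for illustration, not the authors' implementation:

```python
import numpy as np

# Toy linear autoencoder standing in for a trained deep motion model.
# Joint layout (8 joints x 4 values, e.g. quaternions) is an assumption.
N_JOINTS, DIMS = 8, 4
POSE_DIM, LATENT_DIM = N_JOINTS * DIMS, 6

rng = np.random.default_rng(0)
W = rng.standard_normal((LATENT_DIM, POSE_DIM)) * 0.1

def encode(pose):                      # pose vector -> latent code
    return W @ pose

def decode(z):                         # latent code -> pose vector
    return W.T @ z

def puppeteer_joint(pose, joint, rotation, n_iters=10):
    """Sketch of the first method (iterative autoencoding): clamp one
    joint to the user-supplied rotation, then re-encode/decode so the
    model re-adjusts the remaining joints; repeat until settled."""
    lo, hi = joint * DIMS, joint * DIMS + DIMS
    for _ in range(n_iters):
        pose = pose.copy()
        pose[lo:hi] = rotation         # the user controls this joint ...
        pose = decode(encode(pose))    # ... the autoencoder updates the rest
    pose = pose.copy()
    pose[lo:hi] = rotation             # final clamp on the controlled joint
    return pose

# Ingredient of the second method: a latent difference vector between
# two poses, which can be added to another pose's encoding to transfer
# the change (the paper combines this with forward kinematics).
pose_a = rng.standard_normal(POSE_DIM)
pose_b = rng.standard_normal(POSE_DIM)
latent_diff = encode(pose_b) - encode(pose_a)

pose = rng.standard_normal(POSE_DIM)
target = np.array([1.0, 0.0, 0.0, 0.0])
out = puppeteer_joint(pose, joint=2, rotation=target)
shifted = decode(encode(pose) + latent_diff)
```

In this sketch the controlled joint is simply overwritten each iteration while the decode step propagates a model-consistent adjustment to all other joints, which mirrors the division of control described in the abstract.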

Download full text files

  • CHI2022-BisigWegner.pdf
    eng

Metadata
Document Type:Conference Proceeding
Conference Type:Conference Article
Citation Link: https://opus.hs-offenburg.de/8627
Bibliographic Information
Title (English):Puppeteering AI - Interactive Control of an Artificial Dancer
Conference:Generative AI and Computer Human Interaction Workshop at the 2022 Conference on Human Factors in Computing Systems (May 10, 2022 : New Orleans, LA)
Author:Daniel Bisig, Ephraim Wegner
Year of Publication:2022
Contributing Corporation:ACM
First Page:1
Last Page:6
URL:https://www.researchgate.net/publication/360950859
Language:English
Content Information
Institutes:Fakultät Medien (M) (from 22.04.2021)
Institutes:Bibliography
Tag:artificial dancer; dance and technology; deep learning; motion synthesis
Formal Information
Open Access:Green Open Access
Licence:Creative Commons - CC BY-NC-SA - Attribution - NonCommercial - ShareAlike 4.0 International