Learning Embodied Sound-Motion Mappings

Authors

Wallace, Benedikte
Martin, Charles P.
Tørresen, Jim
Nymoen, Kristian

Publisher

Association for Computing Machinery (ACM)

Abstract

Through dance, a wide range of emotions can be expressed. As virtual agents and robots continue to become part of our daily lives, the need for them to efficiently convey emotion and intent increases. When trained to dance, to what extent can AI learn to model the tacit mappings between sound and motion? Here, we explore the creative capacity of a generative model trained on 3D motion capture recordings of improvised dance. We perform a perceptual judgment experiment wherein respondents rate movement generated by our model as well as human performances. While the sound-motion mappings remain somewhat elusive, particularly when compared to examples of human dance, our study shows that in certain aspects related to perceived dance-likeness and expressivity, the model successfully mimics human dance movement. By employing a perceptual study to evaluate our generative model, we aim to further our ability to understand the affordances and limitations of creative AI.
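
The record itself contains no implementation details, so purely as an illustration of the kind of sound-to-motion mapping the abstract describes, the sketch below shows a minimal audio-conditioned motion generator in PyTorch. Everything here is an assumption, not the authors' architecture: the class name SoundToMotion, the LSTM backbone, and all dimensions (40 audio features, 21 joints, hidden size 256) are hypothetical placeholders.

```python
import torch
import torch.nn as nn


class SoundToMotion(nn.Module):
    """Hypothetical audio-conditioned motion generator: maps a sequence
    of audio feature frames to 3D joint positions over time. All layer
    sizes and feature counts are illustrative assumptions."""

    def __init__(self, n_audio_feats=40, n_joints=21, hidden=256):
        super().__init__()
        self.rnn = nn.LSTM(n_audio_feats, hidden, num_layers=2,
                           batch_first=True)
        self.head = nn.Linear(hidden, n_joints * 3)  # (x, y, z) per joint

    def forward(self, audio_feats):
        # audio_feats: (batch, time, n_audio_feats)
        h, _ = self.rnn(audio_feats)
        poses = self.head(h)  # (batch, time, n_joints * 3)
        return poses.view(poses.shape[0], poses.shape[1], -1, 3)


# Toy usage: one four-second clip at 30 audio feature frames per second.
model = SoundToMotion()
audio = torch.randn(1, 120, 40)  # placeholder for e.g. mel-band features
motion = model(audio)            # (1, 120, 21, 3) joint trajectories
```

Note that a deterministic regression head like this tends to average over the many plausible movements for a given sound; probabilistic output layers are a common alternative when the goal is varied, improvisation-like motion.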

Book Title

C&C 2021 - Proceedings of the 13th Conference on Creativity and Cognition

Entity type

Publication
