Tracking Ensemble Performance on Touch-Screens with Gesture Classification and Transition Matrices

Date

2015-05-31

Authors

Martin, Charles
Gardner, Henry
Swift, Ben

Publisher

New Interfaces for Musical Expression

Abstract

We present and evaluate a novel interface for tracking ensemble performances on touch-screens. The system uses a Random Forest classifier to extract touch-screen gestures and transition matrix statistics. It analyses the resulting gesture-state sequences across an ensemble of performers. A series of specially designed iPad apps respond to this real-time analysis of free-form gestural performances with calculated modifications to their musical interfaces. We describe our system and evaluate it through cross-validation and profiling as well as concert experience.
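The transition-matrix analysis described above can be illustrated with a minimal sketch. Here a performer's classified gesture-state sequence (integer labels output by a gesture classifier) is turned into a row-normalised first-order transition matrix. The gesture labels and the example sequence are hypothetical, not taken from the paper, and the Random Forest classification step is assumed to have already happened upstream.

```python
# Hypothetical gesture vocabulary; the paper's actual gesture classes may differ.
GESTURES = ["nothing", "tap", "swipe", "swirl"]

def transition_matrix(states, n_states):
    """Count transitions between consecutive gesture states and
    normalise each row to sum to 1 (a first-order Markov model).
    Rows for states that never occur are left as all zeros."""
    counts = [[0.0] * n_states for _ in range(n_states)]
    for a, b in zip(states, states[1:]):
        counts[a][b] += 1
    for row in counts:
        total = sum(row)
        if total > 0:
            for j in range(n_states):
                row[j] /= total
    return counts

# Illustrative classified gesture sequence for one performer over time.
seq = [0, 1, 1, 2, 1, 3, 3, 2]
T = transition_matrix(seq, len(GESTURES))
```

Statistics of matrices like `T`, computed in real time across all ensemble members, could then drive the interface modifications the abstract describes.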

Keywords

mobile music, ensemble performance, machine learning, transition matrices, gesture, computer music, new interfaces for musical expression

Citation

C. Martin, H. Gardner, and B. Swift, “Tracking ensemble performance on touch-screens with gesture classification and transition matrices,” in Proceedings of the International Conference on New Interfaces for Musical Expression, Baton Rouge, Louisiana, USA, 2015, pp. 359-364.

Type

Conference paper

Book Title

Proceedings of the International Conference on New Interfaces for Musical Expression

Access Statement

Open Access
