
Tracking Ensemble Performance on Touch-Screens with Gesture Classification and Transition Matrices




Martin, Charles
Gardner, Henry
Swift, Ben


New Interfaces for Musical Expression


We present and evaluate a novel interface for tracking ensemble performances on touch-screens. The system uses a Random Forest classifier to extract touch-screen gestures and transition matrix statistics, and analyses the resulting gesture-state sequences across an ensemble of performers. A series of specially designed iPad apps respond to this real-time analysis of free-form gestural performances with calculated modifications to their musical interfaces. We describe our system and evaluate it through cross-validation and profiling as well as concert experience.
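The transition-matrix analysis described in the abstract can be illustrated with a minimal sketch: given a sequence of classified gesture labels, count first-order transitions between consecutive states and row-normalise the counts into probabilities. The gesture vocabulary and sequence below are hypothetical examples, not taken from the paper.

```python
# Hypothetical gesture vocabulary; a real classifier would emit its own labels.
GESTURES = ["nothing", "tap", "swipe", "swirl"]

def transition_matrix(states, vocab=GESTURES):
    """Row-normalised first-order transition matrix for a gesture-state sequence."""
    index = {g: i for i, g in enumerate(vocab)}
    counts = [[0] * len(vocab) for _ in vocab]
    # Count transitions between each pair of consecutive gesture states.
    for prev, curr in zip(states, states[1:]):
        counts[index[prev]][index[curr]] += 1
    # Normalise each row so it sums to 1 (or stays zero if the state never occurs).
    matrix = []
    for row in counts:
        total = sum(row)
        matrix.append([c / total if total else 0.0 for c in row])
    return matrix

# Example gesture-state sequence for one performer.
sequence = ["tap", "tap", "swipe", "swirl", "swirl", "tap"]
m = transition_matrix(sequence)
```

Statistics derived from such matrices (e.g. how often performers change gesture versus repeat one) could then be compared across an ensemble in real time.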



mobile music, ensemble performance, machine learning, transition matrices, gesture, computer music, new interfaces for musical expression


C. Martin, H. Gardner, and B. Swift, “Tracking ensemble performance on touch-screens with gesture classification and transition matrices,” in Proceedings of the International Conference on New Interfaces for Musical Expression, Baton Rouge, Louisiana, USA, 2015, pp. 359–364.



Conference paper

Book Title

Proceedings of the International Conference on New Interfaces for Musical Expression

Access Statement

Open Access
