Tracking Ensemble Performance on Touch-Screens with Gesture Classification and Transition Matrices
Martin, Charles; Gardner, Henry; Swift, Ben
Description
We present and evaluate a novel interface for tracking ensemble performances on touch-screens. The system uses a Random Forest classifier to extract touch-screen gestures and transition matrix statistics. It analyses the resulting gesture-state sequences across an ensemble of performers. A series of specially designed iPad apps respond to this real-time analysis of free-form gestural performances with calculated modifications to their musical interfaces. We describe our system and evaluate it through cross-validation and profiling as well as concert experience.
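The abstract outlines a two-stage analysis: classify touch-screen gestures with a Random Forest, then summarise the resulting gesture-state sequences as transition matrices. The Python sketch below illustrates that general idea only; it is not the authors' implementation, and the gesture labels, feature windows, and function names are hypothetical placeholders.

```python
# Minimal sketch of the pipeline described in the abstract, under assumed
# feature and label definitions: a Random Forest classifies windowed touch
# features into gesture states, and consecutive states are counted into a
# row-normalised transition matrix.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

GESTURES = ["nothing", "tap", "swipe", "swirl"]  # hypothetical gesture classes


def train_gesture_classifier(features, labels):
    """Fit a Random Forest on windowed touch features (e.g. touch count, velocity)."""
    clf = RandomForestClassifier(n_estimators=100, random_state=0)
    clf.fit(features, labels)
    return clf


def transition_matrix(state_sequence, n_states=len(GESTURES)):
    """Count transitions between consecutive gesture states and normalise each row."""
    counts = np.zeros((n_states, n_states))
    for current, nxt in zip(state_sequence[:-1], state_sequence[1:]):
        counts[current, nxt] += 1
    row_sums = counts.sum(axis=1, keepdims=True)
    return np.divide(counts, row_sums, out=np.zeros_like(counts), where=row_sums > 0)


# Example usage for one performer's stream of feature windows:
#   states = clf.predict(feature_windows)
#   matrix = transition_matrix(states)
```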
| Field | Value |
| --- | --- |
| dc.contributor.author | Martin, Charles |
| dc.contributor.author | Gardner, Henry |
| dc.contributor.author | Swift, Ben |
| dc.date.accessioned | 2016-06-06T05:58:37Z |
| dc.date.available | 2016-06-06T05:58:37Z |
| dc.identifier.citation | C. Martin, H. Gardner, and B. Swift, “Tracking ensemble performance on touch-screens with gesture classification and transition matrices,” in Proceedings of the International Conference on New Interfaces for Musical Expression, Baton Rouge, Louisiana, USA, 2015, pp. 359-364. |
| dc.identifier.issn | 2220-4806 |
| dc.identifier.uri | http://hdl.handle.net/1885/102045 |
| dc.description.abstract | We present and evaluate a novel interface for tracking ensemble performances on touch-screens. The system uses a Random Forest classifier to extract touch-screen gestures and transition matrix statistics. It analyses the resulting gesture-state sequences across an ensemble of performers. A series of specially designed iPad apps respond to this real-time analysis of free-form gestural performances with calculated modifications to their musical interfaces. We describe our system and evaluate it through cross-validation and profiling as well as concert experience. |
| dc.publisher | New Interfaces for Musical Expression |
| dc.relation.ispartof | Proceedings of the International Conference on New Interfaces for Musical Expression |
| dc.rights | Copyright remains with the authors. |
| dc.source.uri | http://www.nime.org/proceedings/2015/nime2015_242.pdf |
| dc.subject | mobile music |
| dc.subject | ensemble performance |
| dc.subject | machine learning |
| dc.subject | transition matrices |
| dc.subject | gesture |
| dc.subject | computer music |
| dc.subject | new interfaces for musical expression |
| dc.title | Tracking Ensemble Performance on Touch-Screens with Gesture Classification and Transition Matrices |
| dc.type | Conference paper |
| dc.date.issued | 2015-05-31 |
| local.publisher.url | http://nime.org |
| local.type.status | Published Version |
| local.contributor.affiliation | Martin, C., Research School of Computer Science, The Australian National University |
| local.contributor.affiliation | Gardner, H., Research School of Computer Science, The Australian National University |
| local.contributor.affiliation | Swift, B., Research School of Computer Science, The Australian National University |
| local.bibliographicCitation.startpage | 359 |
| local.bibliographicCitation.lastpage | 364 |
| dcterms.accessRights | Open Access |
| Collections | ANU Research Publications |