Beyond Mapping: Exploring Spatial Musical Interface Design with Augmented Reality
Abstract
This thesis investigates the design of spatial interfaces for musical expression in head-mounted augmented reality (AR) systems, focusing on how such interfaces can support real-time performance, collaboration between musicians, and co-creation with artificial intelligence (AI). In recent years, AR has evolved into a spatial computing medium that enables embodied, multisensory interaction, opening new possibilities for creative expression. In digital musical instrument (DMI) design, integrating AR expands the traditional input-output mapping paradigm by enabling more fluid and collaborative forms of musical interaction. In this research, spatial musical interfaces for head-mounted AR systems are designed around three key dynamics: emergent, through freehand interaction; situated, through collaborative performance; and mediated, through human-AI co-creation. These systems were evaluated through formal studies with musicians to validate the design principles and deepen understanding of how AR reshapes musical experience. A mixed-methods approach, combining first-person design-through-performance with formal user evaluations, guides a step-by-step progression from exploration to validation.
Throughout this thesis, three spatial musical systems are introduced and studied: (1) a freehand-enabled spatial interface for individual expression, developed iteratively through creative practice; (2) a collaborative system designed to support mutual awareness and engagement, evaluated through a factorial design varying real-time visualisation conditions; and (3) an extension of the collaborative system for human-AI co-creation, developed through autoethnographic exploration of system configurations and performance topologies. The first study, involving twenty musicians, evaluated the freehand interface's usability, playability, and musical experience. The results revealed how AR's spatial mobility, interaction, and sonic integration enabled new expressive possibilities. The second study, with four pairs of musicians, examined usability, workload, situational awareness, and mutual engagement with the collaborative system. The findings showed that visualising bodily and instrument actions improved mutual awareness and fostered cohesive musical interaction. The final study, with sixteen musicians, introduced an AI collaborator. By varying the number of human musicians in group improvisations, it demonstrated that transparent visualisation improved musicians' understanding of the AI collaborator's contributions, while the presence of a second human performer reshaped co-creative dynamics and meaning-making.
Overall, this research presents a series of spatial musical interface designs for head-mounted AR systems, enabling novel musical experiences in individual, collaborative, and co-creative contexts. It demonstrates that novel musical expression in AR emerges not from fixed mappings, but from dynamic, situated, and mediated relationships shaped by spatial interaction and social engagement.