MusE-XR: musical experiences in extended reality to enhance learning and performance

dc.contributor.authorJohnson, David
dc.contributor.supervisorTzanetakis, George
dc.contributor.supervisorDamian, Daniela
dc.date.accessioned2019-07-23T18:14:29Z
dc.date.available2019-07-23T18:14:29Z
dc.date.copyright2019en_US
dc.date.issued2019-07-23
dc.degree.departmentDepartment of Computer Scienceen_US
dc.degree.levelDoctor of Philosophy Ph.D.en_US
dc.description.abstractIntegrating state-of-the-art sensory and display technologies with 3D computer graphics, extended reality (XR) affords capabilities to create enhanced human experiences by merging virtual elements with the real world. To better understand how Sound and Music Computing (SMC) can benefit from the capabilities of XR, this thesis presents novel research on the design of musical experiences in extended reality (MusE-XR). Integrating XR with research on computer assisted musical instrument tutoring (CAMIT) as well as New Interfaces for Musical Expression (NIME), I explore the MusE-XR design space to contribute to a better understanding of the capabilities of XR for SMC. The first area of focus in this thesis is the application of XR technologies to CAMIT enabling extended reality enhanced musical instrument learning (XREMIL). A common approach in CAMIT is the automatic assessment of musical performance. Generally, these systems focus on the aural quality of the performance, but emerging XR related sensory technologies afford the development of systems to assess playing technique. Employing these technologies, the first contribution in this thesis is a CAMIT system for the automatic assessment of pianist hand posture using depth data. Hand posture assessment is performed through an applied computer vision (CV) and machine learning (ML) pipeline to classify a pianist’s hands captured by a depth camera into one of three posture classes. Assessment results from the system are intended to be integrated into a CAMIT interface to deliver feedback to students regarding their hand posture. One method to present the feedback is through real-time visual feedback (RTVF) displayed on a standard 2D computer display, but this method is limited by a need for the student to constantly shift focus between the instrument and the display.
XR affords new methods to potentially address this limitation through capabilities to directly augment a musical instrument with RTVF by overlaying 3D virtual objects on the instrument. Due to limited research evaluating the effectiveness of this approach, it is unclear how the added cognitive demands of RTVF in virtual environments (VEs) affect the learning process. To fill this gap, the second major contribution of this thesis is the first known user study evaluating the effectiveness of XREMIL. Results of the study show that an XR environment with RTVF improves participant performance during training, but may lead to decreased improvement after training. On the other hand, interviews with participants indicate that the XR environment increased their confidence, leading them to feel more engaged during training. In addition to enhancing CAMIT, the second area of focus in this thesis is the application of XR to NIME enabling virtual environments for musical expression (VEME). Development of VEME requires a workflow that integrates XR development tools with existing sound design tools. This presents numerous technical challenges, especially to novice XR developers. To simplify this process and facilitate VEME development, the third major contribution of this thesis is an open source toolkit, called OSC-XR. OSC-XR makes VEME development more accessible by providing developers with readily available Open Sound Control (OSC) virtual controllers. I present three new VEMEs, developed with OSC-XR, to identify affordances and guidelines for VEME design. The insights gained through these studies exploring the application of XR to musical learning and performance lead to new affordances and guidelines for the design of effective and engaging MusE-XR.en_US
dc.description.scholarlevelGraduateen_US
dc.identifier.urihttp://hdl.handle.net/1828/10988
dc.languageEnglisheng
dc.language.isoenen_US
dc.rightsAvailable to the World Wide Weben_US
dc.subjectExtended Realityen_US
dc.subjectMixed Realityen_US
dc.subjectMachine Learningen_US
dc.subjectLearning Transferen_US
dc.subjectNew Interfaces for Musical Expressionen_US
dc.subjectComputer Assisted Musical Instrument Tutoringen_US
dc.subjectExtended Reality Experiences for Musical Instrument Learningen_US
dc.subjectVirtual Environment for Musical Expressionen_US
dc.titleMusE-XR: musical experiences in extended reality to enhance learning and performanceen_US
dc.typeThesisen_US
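The abstract describes an ML pipeline that classifies depth-camera captures of a pianist's hands into one of three posture classes. The thesis's actual features, model, and class labels are not given in this record, so the following is only a minimal illustrative sketch of the general idea — a nearest-centroid classifier over invented depth-derived feature vectors, with hypothetical posture names:

```python
import math

# Hypothetical three-class posture labels (illustrative only, not from the thesis).
POSTURES = ["correct", "flat", "collapsed"]

def centroid(vectors):
    """Element-wise mean of a list of equal-length feature vectors."""
    n = len(vectors)
    return [sum(v[i] for v in vectors) / n for i in range(len(vectors[0]))]

def classify(sample, centroids):
    """Return the posture label whose centroid is nearest in Euclidean distance."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(centroids, key=lambda label: dist(sample, centroids[label]))

# Toy training data: invented 2D depth-derived features (e.g. relative knuckle
# and wrist heights); a real system would extract features from depth frames.
train = {
    "correct":   [[0.9, 0.8], [1.0, 0.9]],
    "flat":      [[0.2, 0.2], [0.3, 0.1]],
    "collapsed": [[0.5, -0.4], [0.6, -0.5]],
}
centroids = {label: centroid(vs) for label, vs in train.items()}
print(classify([0.95, 0.85], centroids))  # prints "correct"
```

In practice the thesis reports a full CV and ML pipeline over depth data; this sketch only conveys the shape of the classification step (feature vectors in, one of three posture labels out).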

Files

Original bundle
Name: Johnson_David_PhD_2019.pdf
Size: 15.08 MB
Format: Adobe Portable Document Format

License bundle
Name: license.txt
Size: 1.71 KB
Description: Item-specific license agreed upon to submission