Federico Visi


A constantly growing corpus of interdisciplinary studies supports the idea that music is a complex multimodal medium that is experienced not only by means of sounds but also through body movement. From this perspective, musical instruments can be seen as technological objects coupled with a repertoire of performance gestures. This repertoire is part of an ecological knowledge shared by musicians and listeners alike. It is part of the engine that guides musical experience and has a considerable expressive potential. This thesis explores technical and conceptual issues related to the analysis and creative use of music-related body movements in instrumental music performance. The complexity of this subject required an interdisciplinary approach, which includes the review of multiple theoretical accounts, quantitative and qualitative analysis of data collected in motion capture laboratories, the development and implementation of technologies for the interpretation and interactive use of motion data, and the creation of short musical pieces that actively employ the movement of the performers as an expressive musical feature. The theoretical framework is informed by embodied and enactive accounts of music cognition as well as by systematic studies of music-related movement and expressive music performance. The assumption that the movements of a musician are part of a shared knowledge is empirically explored through an experiment aimed at analysing the motion capture data of a violinist performing a selection of short musical excerpts. A group of subjects with no prior experience playing the violin is then asked to mime a performance following the audio excerpts recorded by the violinist. Motion data is recorded, analysed, and compared with the expert’s data. This is done both quantitatively through data analysis as well as qualitatively by relating the motion data to other high-level features and structures of the musical excerpts.
Solutions to issues regarding capturing and storing movement data and its use in real-time scenarios are proposed. For the interactive use of motion-sensing technologies in music performance, various wearable sensors have been employed, along with different approaches for mapping control data to sound synthesis and signal processing parameters. In particular, novel approaches for the extraction of meaningful features from raw sensor data and the use of machine learning techniques for mapping movement to live electronics are described. To complete the framework, an essential element of this research project is the composition and performance of études that explore the creative use of body movement in instrumental music from a Practice-as-Research perspective. This works as a test bed for the proposed concepts and techniques. Mapping concepts and technologies are challenged in a scenario constrained by the use of musical instruments, and different mapping approaches are implemented and compared. In addition, techniques for notating movement in the score, and the impact of interactive motion sensor systems on instrumental music practice from the performer’s perspective, are discussed. Finally, the chapter concluding the part of the thesis dedicated to practical implementations describes a novel method for mapping movement data to sound synthesis. This technique is based on the analysis of multimodal motion data collected from multiple subjects, and its design draws from the theoretical, analytical, and practical works described throughout the dissertation. Overall, the parts and the diverse approaches that constitute this thesis work in synergy, contributing to the ongoing discourses on the study of musical gestures and the design of interactive music systems from multiple angles.
