
dc.contributor.supervisor: Miranda, Eduardo
dc.contributor.author: Livingstone, Dan
dc.contributor.other: School of Engineering, Computing and Mathematics [en_US]
dc.date.accessioned: 2013-01-22T12:00:18Z
dc.date.available: 2013-01-22T12:00:18Z
dc.date.issued: 2009
dc.identifier: 755040 [en_US]
dc.identifier.uri: http://hdl.handle.net/10026.1/1252
dc.description.abstract: In order to develop successful collaborative music systems, a variety of subtle interactions need to be identified and integrated. Gesture capture, motion tracking, real-time synthesis, environmental parameters and ubiquitous technologies can each be used effectively to develop innovative approaches to instrument design, sound installations, interactive music and generative systems. Current solutions tend to prioritise one or more of these approaches, refining a particular interface technology, software design or compositional approach developed for a specific composition, performer or installation environment. Within this diverse field a group of novel controllers, described as 'Tangible Interfaces', has been developed. These are intended for use by novices and in many cases follow a simple model of interaction in which synthesis parameters are controlled through simple user actions. Other approaches offer sophisticated compositional frameworks, but many of these are idiosyncratic and highly personalised; as such they are difficult to engage with and ineffective for groups of novices. The objective of this research is to develop effective design strategies for implementing collaborative sound environments, using key terms and vocabulary drawn from the available literature. This is articulated by combining an empathic design process with controlled sound perception and interaction experiments. The identified design strategies have been applied to the development of a new collaborative digital instrument. A range of technical and compositional approaches was considered to define this process, which can be described as Adaptive Social Composition. [en_US]
dc.language.iso: en [en_US]
dc.publisher: University of Plymouth [en_US]
dc.subject: Co-Creation
dc.subject: Interaction Design
dc.subject: Collaborative Behaviour
dc.subject: Adaptive Music
dc.subject: Novel Interfaces
dc.subject: Computer Music [en_US]
dc.title: Design Strategies for Adaptive Social Composition: Collaborative Sound Environments [en_US]
dc.type: Thesis
plymouth.version: Full version [en_US]
dc.identifier.doi: http://dx.doi.org/10.24382/1471





All items in PEARL are protected by copyright law.
Author manuscripts deposited to comply with open access mandates are made available in accordance with publisher policies. Please cite only the published version using the details provided on the item record or document. In the absence of an open licence (e.g. Creative Commons), permissions for further reuse of content should be sought from the publisher or author.