
dc.contributor.supervisor: Miranda, Eduardo
dc.contributor.author: Kirke, Alexis
dc.contributor.other: Faculty of Arts, Humanities and Business
dc.date.accessioned: 2011-12-13T08:49:10Z
dc.date.available: 2011-12-13T08:49:10Z
dc.date.issued: 2011
dc.identifier: 933609
dc.identifier.uri: http://hdl.handle.net/10026.1/895
dc.description.abstract:

We investigate the properties of a new multi-agent system (MAS) for computer-aided composition called IPCS (pronounced “ipp-siss”), the Intermediate Performance Composition System, which generates expressive performance as part of its compositional process and produces emergent melodic structures through a novel multi-agent process. IPCS consists of a small-to-medium-sized collection of agents (2 to 16) in which each agent can perform monophonic tunes and learn monophonic tunes from other agents. Each agent has an affective state (an “artificial emotional state”) that affects how it performs music to other agents; e.g. a “happy” agent will perform “happier” music. An agent's performance not only involves compositional changes to the music but also adds smaller changes based on expressive music performance algorithms for humanization. Every agent is initialized with a tune containing the same single note, and over the interaction period longer tunes are built through agent interaction. Agents will only learn tunes performed to them by other agents if the affective content of the tune is similar to their current affective state; learned tunes are concatenated to the end of their current tune. Each agent in the society thus grows its own tune during the interaction process. Agents develop “opinions” of other agents that perform to them, depending on how much the performing agent helps their tunes grow, and these opinions affect whom they interact with in the future. IPCS is not a mapping from multi-agent interaction onto musical features; rather, the agents use music itself to communicate emotions. In spite of the lack of explicit melodic intelligence in IPCS, the system is shown to generate non-trivial melody pitch sequences as a result of emotional communication between agents.
The melodies also have a hierarchical structure that emerges from the social interaction structure of the multi-agent system. The interactive humanizations produce micro-timing and loudness deviations in the melody which are shown to express its hierarchical generative structure without the need for the structural-analysis software frequently used in computer music humanization.

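The interaction loop described in the abstract (affect-gated learning, tune concatenation, and opinion updates) can be sketched as follows. This is a minimal illustrative sketch, not the thesis's actual implementation: the class and function names, the scalar affect model, the pitch-shift "performance", and the threshold value are all assumptions introduced here.

```python
import random

class Agent:
    """Hypothetical sketch of an IPCS-style agent; names and affect
    model are illustrative assumptions, not the thesis's code."""

    def __init__(self, name, affect, seed_note=60):
        self.name = name
        self.affect = affect          # scalar "affective state" in [0, 1]
        self.tune = [seed_note]       # every agent starts with one note
        self.opinions = {}            # performer name -> opinion score

    def perform(self):
        # Stand-in for the compositional + expressive transformations:
        # pitch is shifted according to the agent's affective state.
        shift = round((self.affect - 0.5) * 4)
        return [n + shift for n in self.tune], self.affect

    def listen(self, performer, tune, affect, threshold=0.2):
        # Learn only if the performed affect is close to our own state;
        # learned tunes are concatenated onto our growing tune.
        if abs(affect - self.affect) < threshold:
            self.tune.extend(tune)
            self.opinions[performer] = self.opinions.get(performer, 0) + 1
        else:
            self.opinions[performer] = self.opinions.get(performer, 0) - 1

def interact(agents, rounds=50, rng=random.Random(0)):
    # Random pairing; the thesis's opinion-weighted partner choice
    # could replace rng.sample here.
    for _ in range(rounds):
        performer, listener = rng.sample(agents, 2)
        tune, affect = performer.perform()
        listener.listen(performer.name, tune, affect)

agents = [Agent(f"a{i}", affect=i / 7) for i in range(8)]
interact(agents)
```

After a run, agents with similar affective states accumulate longer tunes and positive mutual opinions, while dissimilar agents reject each other's performances.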
dc.language.iso: en
dc.publisher: University of Plymouth
dc.subject: Computer Music
dc.subject: Algorithmic Composition
dc.subject: Expressive Performance
dc.subject: Multi-Agent Systems
dc.title: Application of Intermediate Multi-Agent Systems to Integrated Algorithmic Composition and Expressive Performance of Music
dc.type: Thesis
dc.identifier.doi: http://dx.doi.org/10.24382/4900



