Show simple item record

dc.contributor.supervisor: Culverhouse, Phil
dc.contributor.author: Adams, Samantha
dc.contributor.other: School of Engineering, Computing and Mathematics
dc.date.accessioned: 2013-05-14T10:55:39Z
dc.date.available: 2013-05-14T10:55:39Z
dc.date.issued: 2013
dc.identifier: 10148014
dc.identifier.uri: http://hdl.handle.net/10026.1/1464
dc.description: Full version unavailable due to 3rd party copyright restrictions.
dc.description.abstract:

This project applies principles from Computational Neuroscience to robotics research, in particular to develop systems inspired by how nature solves sensorimotor coordination tasks. The overall aim has been to build a self-organising sensorimotor system using biologically inspired techniques, based upon human cortical development, which can in the future be implemented in neuromorphic hardware. This can then deliver the benefits of low power consumption and real-time operation, but with flexible learning, onboard autonomous robots.

A core principle is the Self-Organising Feature Map, which is based upon the theory of how 2D maps develop in real cortex to represent complex information from the environment. A framework for developing feature maps for both motor and visual directional selectivity, representing eight different directions of motion, is described, as well as how the maps can be coupled together to make a basic visuomotor system. In contrast to many previous works, which use artificially generated visual inputs (for example, image sequences of oriented moving bars or mathematically generated Gaussian bars), a novel feature of the current work is that the visual input is generated by a DVS 128 silicon retina camera, a neuromorphic device that produces spike events in a frame-free way.

One of the main contributions of this work has been to develop a method of autonomous regulation of the map development process, which adapts the learning depending upon input activity. The main results show that distinct directionally selective maps for both the motor and visual modalities are produced under a range of experimental scenarios. The adaptive learning process successfully controls the rate of learning in both motor and visual map development, and is used to indicate when sufficient patterns have been presented, thus avoiding the need to define in advance the quantity and range of training data. The coupling training experiments show that the visual input learns to modulate the original motor map response, creating a new visual-motor topological map.
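The Self-Organising Feature Map named in the abstract follows Kohonen's algorithm: each input is matched to a best-matching unit, and that unit and its grid neighbours are pulled towards the input. A minimal sketch is given below; the grid size, decay schedules, and unit-vector "direction" inputs are illustrative assumptions, not details taken from the thesis (which instead regulates learning adaptively from input activity and uses spiking inputs).

```python
import numpy as np

# Minimal Kohonen Self-Organising Feature Map sketch.
# All parameter values here are illustrative, not from the thesis.
rng = np.random.default_rng(0)

grid = 8                       # 8x8 map of units
dim = 2                        # inputs are 2D direction vectors
weights = rng.normal(size=(grid, grid, dim))

# Grid coordinates of every unit, used by the neighbourhood function.
coords = np.stack(
    np.meshgrid(np.arange(grid), np.arange(grid), indexing="ij"), axis=-1
)

def train_step(x, lr, sigma):
    """One SOM update: find the best-matching unit (BMU), then pull
    neighbouring units' weights towards the input x."""
    dists = np.linalg.norm(weights - x, axis=-1)
    bmu = np.unravel_index(np.argmin(dists), dists.shape)
    # Gaussian neighbourhood around the BMU on the 2D grid.
    grid_dist2 = np.sum((coords - np.array(bmu)) ** 2, axis=-1)
    h = np.exp(-grid_dist2 / (2 * sigma ** 2))
    weights[...] += lr * h[..., None] * (x - weights)

# Train on unit vectors for eight directions of motion (45-degree steps).
angles = np.arange(8) * np.pi / 4
steps = 2000
for t in range(steps):
    a = rng.choice(angles)
    x = np.array([np.cos(a), np.sin(a)])
    # Fixed decay schedules stand in for the thesis's adaptive regulation.
    frac = t / steps
    train_step(x, lr=0.5 * (1 - frac) + 0.01, sigma=3.0 * (1 - frac) + 0.5)
```

After training, nearby units on the grid respond to similar directions of motion, giving the kind of directionally selective topological map the abstract describes.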

dc.description.sponsorship: EPSRC, University of Plymouth Graduate School
dc.language.iso: en
dc.publisher: University of Plymouth
dc.subject: Spiking SOM
dc.subject: Adaptive Plasticity
dc.subject: Robotics
dc.subject: Biologically Inspired
dc.title: The Development of Bio-Inspired Cortical Feature Maps for Robot Sensorimotor Controllers
dc.type: Thesis
plymouth.version: Edited version
dc.identifier.doi: http://dx.doi.org/10.24382/4850




All items in PEARL are protected by copyright law.
Author manuscripts deposited to comply with open access mandates are made available in accordance with publisher policies. Please cite only the published version using the details provided on the item record or document. In the absence of an open licence (e.g. Creative Commons), permissions for further reuse of content should be sought from the publisher or author.