Abstract
Recent advances in behavioural and computational neuroscience, cognitive robotics, and the hardware implementation of large-scale neural networks provide the opportunity for an accelerated understanding of brain functions and for the design of interactive robotic systems based on brain-inspired control systems. This is especially the case in the domain of action and language learning, given the significant scientific and technological developments in this field. In this work we describe how a neuroanatomically grounded spiking neural network for visual attention has been extended with a word-learning capability and integrated with the iCub humanoid robot to demonstrate attention-led object naming. Experiments were carried out on both a simulated and a real iCub robot platform, with successful results. The iCub robot is capable of associating a label with an object of a ‘preferred’ orientation when visual and word stimuli are presented concurrently in the scene, as well as attending to that object, thus naming it. After learning is complete, the name of the object can be recalled successfully when only the visual input is present, even when the object has been moved from its original position or when other objects are present as distractors.
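To make the word-object association idea in the abstract concrete, the following is a minimal, rate-based sketch of Hebbian binding between a "word" population and a "visual" population, with recall from visual input alone. It is not the paper's implementation: the described system uses a spiking neural network integrated with the iCub, whereas the population sizes, learning rate, decay term, and winner-take-all readout below are illustrative assumptions only.

import numpy as np

# Hypothetical rate-based sketch of Hebbian word-object association.
# All sizes and parameters are assumptions for illustration, not values
# from the paper (which describes a spiking neural network on the iCub).

rng = np.random.default_rng(0)

N_VISUAL = 64    # units coding visual/orientation features (assumed)
N_WORD = 16      # units coding word labels (assumed)

W = np.zeros((N_WORD, N_VISUAL))   # word <- visual association weights
ETA = 0.05                         # learning rate (assumed)

def train(visual, word, steps=200):
    """Strengthen word-visual weights while both stimuli are present."""
    global W
    for _ in range(steps):
        # Hebbian update with a small decay term to keep weights bounded.
        W += ETA * np.outer(word, visual) - 0.001 * W

def recall(visual):
    """Recall a label from visual input alone (winner-take-all readout)."""
    activation = W @ visual
    return int(np.argmax(activation))

# Concurrent presentation of an object pattern and its label (word unit 3).
obj_a = (rng.random(N_VISUAL) > 0.7).astype(float)
label_a = np.zeros(N_WORD)
label_a[3] = 1.0
train(obj_a, label_a)

# After learning, a noisy view of the same object still recalls label 3,
# loosely mirroring recall after the object has been moved or perturbed.
noisy_view = np.clip(obj_a + 0.1 * rng.standard_normal(N_VISUAL), 0.0, 1.0)
print(recall(noisy_view))   # -> 3

The co-occurrence-driven update is the key point: the label is only reinforced while visual and word activity overlap in time, which is the behaviour the abstract attributes to the robot during concurrent stimulus presentation.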
DOI
10.1016/j.robot.2018.02.010
Publication Date
2018-06-01
Publication Title
Robotics and Autonomous Systems
Volume
104
ISSN
0921-8890
Organisational Unit
School of Engineering, Computing and Mathematics
First Page
56
Last Page
71
Recommended Citation
Hernández, G. D., Adams, S., Rast, A., Wennekers, T., Furber, S., & Cangelosi, A. (2018) 'Visual attention and object naming in humanoid robots using a bio-inspired spiking neural network', Robotics and Autonomous Systems, 104, pp. 56-71. Available at: https://doi.org/10.1016/j.robot.2018.02.010