ORCID
- Li, Chunxu: 0000-0001-7851-0260
Abstract
With growing demands for improved quality of life, companion robots have gradually become a hotspot of application in healthy home living. In this article, a novel bionic human-robot interaction (HRI) strategy using stereo vision algorithms has been developed to imitate the animal vision system on the Owl robot. Depth information for a target is obtained via two methods: vergence and disparity. Vergence requires physically tracking the target, moving each camera to align with a chosen object; through successive camera movements (saccades), a sparse depth map of the scene can be built up. Disparity, by contrast, requires the cameras to be fixed and parallel, using the position of the target within the field of view of a stereo pair of cameras to calculate distance. As disparity does not require the cameras to move, multiple targets can be chosen to build up a disparity map, providing depth information for the whole scene. In addition, a salience model is implemented to imitate how people explore a scene. This is achieved with feature maps, which filter the scene to highlight areas of interest such as color and edges; it is a purely bottom-up approach based on Itti and Koch's saliency model. A series of experiments has been conducted on the Plymouth Owl robot to validate the proposed interface.
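For the disparity method the abstract describes, depth for a rectified, parallel stereo pair follows the standard relation Z = fB/d (focal length times baseline over pixel disparity). A minimal sketch of that calculation follows; the focal length and baseline values are illustrative assumptions, not parameters taken from the paper or the Owl robot hardware:

```python
# Minimal sketch: depth from stereo disparity for a rectified, parallel pair.
# The focal length (pixels) and baseline (metres) below are hypothetical
# illustration values, not the Owl robot's actual camera parameters.

def depth_from_disparity(disparity_px, focal_px=700.0, baseline_m=0.12):
    """Return depth in metres from a pixel disparity, via Z = f * B / d."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a finite depth")
    return focal_px * baseline_m / disparity_px

# A nearer target yields a larger disparity and hence a smaller depth:
near = depth_from_disparity(84.0)  # 700 * 0.12 / 84 = 1.0 m
far = depth_from_disparity(42.0)   # 700 * 0.12 / 42 = 2.0 m
```

Because this needs no camera movement, evaluating it at many matched image points builds up the disparity map the abstract refers to.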
DOI
10.1002/adc2.54
Publication Date
2020-10-11
Publication Title
Advanced Control for Applications: Engineering and Industrial Systems
Embargo Period
2021-08-07
Organisational Unit
School of Engineering, Computing and Mathematics
Recommended Citation
Rogers, J., Culverhouse, P., Wickenden, B., & Li, C. (2020) 'Development of a bionic interactive interface for Owl robot using stereo vision algorithms', Advanced Control for Applications: Engineering and Industrial Systems. Available at: https://doi.org/10.1002/adc2.54