Abstract
Gaze stabilization is an important requirement for humanoid robots. Previous work on this topic has focused on the integration of inertial and visual information. Little attention has been given to a third component: the knowledge the robot has of its own movement. In this work we propose a comprehensive framework for gaze stabilization in a humanoid robot, focusing on the problem of compensating for disturbances induced in the cameras by the robot's self-generated movements. We employ two separate signals for stabilization: (1) an anticipatory term obtained from the velocity commands sent to the joints while the robot moves autonomously; (2) a feedback term from the onboard gyroscope, which compensates for unpredicted external disturbances. We first provide the mathematical formulation to derive the forward and differential kinematics of the fixation point of the stereo system. We then test our method on the iCub robot. We show that the stabilization consistently reduces the residual optical flow both during the robot's own movements and in the presence of external disturbances. We also demonstrate that proper integration of the neck DoFs is crucial to achieve correct stabilization.
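The combination of an anticipatory (feedforward) term with a gyroscope-based feedback term can be illustrated with a minimal sketch. The snippet below is not the authors' controller: the Jacobian names, the feedback gain k_fb, and the damped pseudo-inverse are assumptions introduced purely for illustration of how the two velocity signals might be merged into a compensating eye/neck command.

```python
import numpy as np

def stabilization_command(J_eyes, J_body, qdot_body, omega_gyro,
                          k_fb=1.0, damping=1e-6):
    """Illustrative gaze-stabilization command (not the paper's exact controller).

    J_eyes     : (3, n_eyes) Jacobian from eye/neck joints to camera angular velocity
    J_body     : (3, n_body) Jacobian from the moving body joints to camera angular velocity
    qdot_body  : (n_body,)   velocity commands sent to the body joints (anticipatory input)
    omega_gyro : (3,)        angular velocity measured by the onboard gyroscope (feedback input)
    """
    # Anticipatory term: disturbance predicted from the commanded joint velocities
    omega_ff = J_body @ qdot_body
    # Feedback term: residual rotation actually measured by the inertial sensor
    omega_fb = k_fb * omega_gyro
    # Total disturbance to cancel at the cameras
    omega_dist = omega_ff + omega_fb
    # Damped least-squares pseudo-inverse of the eye/neck Jacobian
    JJt = J_eyes @ J_eyes.T
    J_pinv = J_eyes.T @ np.linalg.inv(JJt + damping * np.eye(JJt.shape[0]))
    # Counter-rotate the cameras so the fixation point stays stable
    return -J_pinv @ omega_dist
```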
DOI
10.1109/humanoids.2014.7041369
Publication Date
2014-11-01
Event
2014 IEEE-RAS 14th International Conference on Humanoid Robots (Humanoids 2014)
Publication Title
2014 IEEE-RAS International Conference on Humanoid Robots
Publisher
IEEE
Embargo Period
2024-11-22
Recommended Citation
Roncone, A., Pattacini, U., Metta, G., & Natale, L. (2014) 'Gaze stabilization for humanoid robots: A comprehensive framework', 2014 IEEE-RAS International Conference on Humanoid Robots, IEEE. Available at: https://doi.org/10.1109/humanoids.2014.7041369