Modelling mental rotation in cognitive robots
Mental rotation concerns the cognitive processes that allow an agent to mentally rotate the image of an object in order to solve a given task, for example to judge whether two objects seen at different orientations are the same or different. Here we present a system-level bio-constrained model, developed within a neurorobotics framework, that provides an embodied account of mental rotation processes relying on neural mechanisms involving motor affordance encoding, motor simulation, and the anticipation of the sensory consequences of actions (both visual and proprioceptive). This model and methodology are in agreement with the most recent theoretical and empirical research on mental rotation. The model was validated through experiments with a simulated humanoid robot (iCub) engaged in solving a classical mental rotation test. The results show that the robot is able to solve the task and, in agreement with data from psychology experiments, exhibits response times that depend linearly on the angular disparity between the objects. This model represents a novel, detailed operational account of the embodied brain mechanisms that may underlie mental rotation. © The Author(s) 2013.
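The linear relation between response time and angular disparity reported above can be sketched as a simple linear model, RT = intercept + slope × angle. This is a minimal illustrative sketch only: the intercept and slope values below are hypothetical placeholders, not parameters taken from the paper or from the iCub experiments.

```python
import numpy as np

def predicted_rt(angle_deg, intercept=1.0, slope=0.02):
    """Illustrative linear response-time model for mental rotation.

    RT (in seconds) grows linearly with the angular disparity (degrees)
    between the two compared objects. The default intercept and slope
    are assumed values chosen only for demonstration.
    """
    return intercept + slope * np.asarray(angle_deg, dtype=float)

# Example: predicted response times at a few angular disparities.
angles = np.array([0.0, 60.0, 120.0, 180.0])
rts = predicted_rt(angles)

# Recovering the parameters with a least-squares line fit confirms
# the linear dependence the behavioural data would show.
slope, intercept = np.polyfit(angles, rts, 1)
```

In the psychology literature this linear trend is the classic signature of mental rotation (response time increases roughly proportionally with the rotation required), which is why the robot reproducing it is taken as evidence of validity.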