Abstract
Alignment is a phenomenon observed in human conversation: dialog partners' behavior converges in many respects. Such alignment has been proposed to be automatic and the basis of successful communication. Recent research on human–computer dialog supports a mediated communicative design account of alignment, according to which the extent of alignment is influenced by interlocutors' beliefs about each other. Our work adds to these findings in two ways: (a) we investigate alignment of manual actions rather than lexical choice, and (b) participants interact with the iCub humanoid robot rather than an artificial computer dialog system. Our results confirm that alignment also takes place in the domain of actions. Although we were not able to replicate the results of the original study in general in this setting, in accordance with its findings, participants with a high questionnaire score for emotional stability and participants who are familiar with robots align their actions more to a robot they believe to be basic than to one they believe to be advanced. Regarding alignment over the course of an interaction, the extent of alignment appears to remain constant when participants believe the robot to be advanced, but it increases over time when they believe the robot to be a basic version.
DOI
10.1007/s12369-014-0252-0
Publication Date
2015-04-01
Publication Title
International Journal of Social Robotics
Volume
7
Issue
2
Publisher
Springer Science and Business Media LLC
ISSN
1875-4805
Embargo Period
2024-11-22
First Page
241
Last Page
252
Recommended Citation
Vollmer, A., Rohlfing, K., Wrede, B., & Cangelosi, A. (2015) 'Alignment to the Actions of a Robot', International Journal of Social Robotics, 7(2), pp. 241–252. Available at: https://doi.org/10.1007/s12369-014-0252-0