ORCID
- Li, Chunxu: 0000-0001-7851-0260
Abstract
In this paper, an Augmented Reality (AR) based system for the remote control and adjustment of robots has been developed, with the aim of making robot interaction and adjustment easier and more accurate from a remote location. A LeapMotion sensor based controller has been investigated to track the movement of the operator's hands. The data from the controller allow gestures and the position of the palm's central point to be detected and tracked. A Kinect V2 camera measures the corresponding motion velocities in the x, y, and z directions after the proposed post-processing algorithm is applied. Unreal Engine 4 is used to create an AR environment in which the user can monitor the control process immersively. A Kalman filtering (KF) algorithm is employed to fuse the position signals from the LeapMotion sensor with the velocity signals from the Kinect camera sensor. The fused/optimal data are sent over the User Datagram Protocol (UDP) to teleoperate a Baxter robot in real time. Several experiments have been conducted to validate the proposed method.
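The fusion step described in the abstract (combining a position measurement from one sensor with a velocity measurement from another via a Kalman filter) can be sketched as follows. This is a minimal single-axis illustration under a constant-velocity motion model; the update rate, noise covariances, and sensor standard deviations are illustrative assumptions, not the paper's tuned parameters.

```python
import numpy as np

# State x = [position, velocity]; the LeapMotion-style sensor supplies a
# position reading and the depth-camera-style sensor supplies a velocity
# reading, so both state components are directly observed (H = I).

dt = 0.02  # assumed 50 Hz update rate

F = np.array([[1.0, dt],
              [0.0, 1.0]])       # constant-velocity motion model
H = np.eye(2)                    # position and velocity both measured
Q = np.diag([1e-4, 1e-3])        # process noise covariance (assumed)
R = np.diag([1e-2, 5e-2])        # measurement noise covariance (assumed)

def kf_step(x, P, z):
    """One predict/update cycle; z = [measured_position, measured_velocity]."""
    # Predict
    x = F @ x
    P = F @ P @ F.T + Q
    # Update
    y = z - H @ x                    # innovation
    S = H @ P @ H.T + R              # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)   # Kalman gain
    x = x + K @ y
    P = (np.eye(2) - K @ H) @ P
    return x, P

# Example: a hand moving at a constant 0.1 m/s, observed with noisy sensors.
x = np.zeros(2)
P = np.eye(2)
rng = np.random.default_rng(0)
for k in range(100):
    true_p, true_v = 0.1 * k * dt, 0.1
    z = np.array([true_p + rng.normal(0.0, 0.10),   # noisy position reading
                  true_v + rng.normal(0.0, 0.22)])  # noisy velocity reading
    x, P = kf_step(x, P, z)
```

After a few dozen iterations the fused estimate tracks the true position and velocity far more smoothly than either raw sensor stream, which is the property the paper exploits before transmitting the commands over UDP to the robot.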
DOI
10.3390/s19204586
Publication Date
2019-10-22
Publication Title
Sensors
Volume
19
Issue
20
ISSN
1424-8220
Embargo Period
2020-06-02
Organisational Unit
School of Engineering, Computing and Mathematics
First Page
4586
Last Page
4586
Recommended Citation
Li, C., Fahmy, A., & Sienz, J. (2019) 'An Augmented Reality Based Human-Robot Interaction Interface Using Kalman Filter Sensor Fusion', Sensors, 19(20), pp. 4586-4586. Available at: https://doi.org/10.3390/s19204586