Show simple item record

dc.contributor.author: Chen, J
dc.contributor.author: Glover, M
dc.contributor.author: Yang, C
dc.contributor.author: Li, C
dc.contributor.author: Li, Z
dc.contributor.author: Cangelosi, A
dc.date.accessioned: 2020-07-09T05:41:17Z
dc.date.available: 2020-07-09T05:41:17Z
dc.date.issued: 2017-01-01
dc.identifier.isbn: 9783319641065
dc.identifier.issn: 0302-9743
dc.identifier.uri: http://hdl.handle.net/10026.1/15865
dc.description.abstract:

© Springer International Publishing AG 2017. In this paper, a novel interface for human-robot interaction has been developed to provide an enhanced user experience for teleoperators. The interface has been implemented and tested on a Baxter robot platform and can be easily adapted to other robot platforms. The main objective of this work is to provide a teleoperator with an immersive experience when controlling a telerobot arm, by enabling the user to see and feel what the robot sees and feels from a first-person point of view. This objective has been achieved by our designed interface, which integrates a haptic feedback device, a virtual reality headset, and an RGB-D camera. An operator can manipulate a robotic arm and receive force feedback about interactions between the robot's grippers and the robot's environment, whilst viewing the captured visual information of the robot's workspace on the screen of the virtual reality headset. A servo motor driving platform has been designed as a new robot head to manipulate the camera mounted on top of it, such that a teleoperator is able to control the pose of the camera in a natural manner via the wearable virtual reality headset. The orientation of the built-in inertial measurement unit (IMU) of the virtual reality headset is used to directly command the angles of the head platform on which the camera is mounted. The operator will have an immersive and in-depth experience when manipulating the robotic arm. Extensive tests with a variety of users have been carried out to evaluate the design in this work with quantified analysis.
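The head-tracking scheme described in the abstract, where the headset IMU orientation directly commands the angles of the camera platform, can be sketched as follows. This is a minimal illustrative sketch, not code from the paper: the function names, servo limits, and degree-based command convention are all assumptions.

```python
import math

# Illustrative limits for the pan/tilt servos of the camera head
# platform (hypothetical values, not from the paper).
SERVO_LIMITS_DEG = {"pan": (-90.0, 90.0), "tilt": (-45.0, 45.0)}

def clamp(value, low, high):
    """Constrain a value to the closed interval [low, high]."""
    return max(low, min(high, value))

def imu_to_servo_angles(yaw_rad, pitch_rad):
    """Map headset IMU yaw/pitch (radians) to clamped pan/tilt
    servo commands (degrees), mirroring the direct mapping the
    abstract describes."""
    pan = clamp(math.degrees(yaw_rad), *SERVO_LIMITS_DEG["pan"])
    tilt = clamp(math.degrees(pitch_rad), *SERVO_LIMITS_DEG["tilt"])
    return {"pan": pan, "tilt": tilt}

# Example: the operator turns their head 30 degrees left and 10 degrees down;
# the camera platform is commanded to follow within its servo limits.
cmd = imu_to_servo_angles(math.radians(30.0), math.radians(-10.0))
```

Clamping to the servo range keeps an extreme head movement from driving the platform past its mechanical limits while still tracking the operator's gaze within them.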
dc.format.extent: 1 - 15
dc.language.iso: en
dc.title: Development of an immersive interface for robot teleoperation
dc.type: Conference Contribution
plymouth.volume: 10454 LNAI
plymouth.publication-status: Published
plymouth.journal: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
dc.identifier.doi: 10.1007/978-3-319-64107-2_1
plymouth.organisational-group: /Plymouth
plymouth.organisational-group: /Plymouth/Faculty of Science and Engineering
plymouth.organisational-group: /Plymouth/Faculty of Science and Engineering/School of Engineering, Computing and Mathematics
plymouth.organisational-group: /Plymouth/REF 2021 Researchers by UoA
plymouth.organisational-group: /Plymouth/REF 2021 Researchers by UoA/UoA11 Computer Science and Informatics
plymouth.organisational-group: /Plymouth/Research Groups
plymouth.organisational-group: /Plymouth/Research Groups/Institute of Health and Community
plymouth.organisational-group: /Plymouth/Research Groups/Marine Institute
plymouth.organisational-group: /Plymouth/Users by role
plymouth.organisational-group: /Plymouth/Users by role/Academics
dc.identifier.eissn: 1611-3349
dc.rights.embargoperiod: Not known
rioxxterms.versionofrecord: 10.1007/978-3-319-64107-2_1
rioxxterms.licenseref.uri: http://www.rioxx.net/licenses/all-rights-reserved
rioxxterms.type: Conference Paper/Proceeding/Abstract



