
dc.contributor.supervisor: Cangelosi, Angelo
dc.contributor.author: Belmonte Klein, Frederico
dc.contributor.other: School of Engineering, Computing and Mathematics en_US
dc.date.accessioned: 2020-11-09T16:07:39Z
dc.date.issued: 2020
dc.identifier: 10512143 en_US
dc.identifier.uri: http://hdl.handle.net/10026.1/16638
dc.description.abstract:

This thesis describes the classification of human activity for the purpose of detecting falls, later extended to multiple activities. This was done with the final intention of implementing a robotic companion for older persons that can provide a degree of automated care in the event of an emergency. The complexity of this work, combined with the restrictions of the robot, motivated the creation of an infrastructure abstraction to allow deferred (decentralized) processing. The initial work implemented classifiers that use pre-processed skeleton data extracted from RGB-D sensors, with additional processing steps to make the classification robust to changes. RGB-D classification first focused on fall detection and was then extended to general activities that could be classified from skeleton data. A later approach used CNNs to classify video footage of activities. All of these algorithms were modified to output classifications in real time. The results achieved were around 90% accuracy for a simple fall vs. not-fall task on the TST fall v2 dataset, 70% global combined accuracy for the 12 actions of CAD60 using skeleton data, and 75% accuracy for the 51 actions of HMDB51, all close to state-of-the-art performance on those datasets. On new activity data based on skeletons and video, however, results were less encouraging, with 33.5% accuracy on skeleton data and 37.9% accuracy on video.

While these results do not yet allow a robotic platform to perform action detection, the overarching structure of the systems necessary to execute it was demonstrated and used successfully, opening the door for future research using more complex systems.

en_US
dc.description.sponsorship: CNPq Brazil (scholarship 232590/2014-1) en_US
dc.language.iso: en
dc.publisher: University of Plymouth
dc.subject: Human activity recognition en_US
dc.subject: Activity detection en_US
dc.subject: Socially assistive robots en_US
dc.subject: Fall detection en_US
dc.subject: Classifier en_US
dc.subject.classification: PhD en_US
dc.title: Fall and activity detection framework on a robotic platform for older person care en_US
dc.type: Thesis
plymouth.version: publishable en_US
dc.identifier.doi: http://dx.doi.org/10.24382/1148
dc.rights.embargodate: 2021-11-09T16:07:39Z
dc.rights.embargoperiod: 12 months en_US
dc.type.qualification: Doctorate en_US
rioxxterms.version: NA
plymouth.orcid.id: 0000-0003-4316-0496 en_US




