Show simple item record

dc.contributor.author: Masala, Giovanni
dc.contributor.author: Grosso, E
dc.date.accessioned: 2017-07-04T09:34:04Z
dc.date.available: 2017-07-04T09:34:04Z
dc.date.issued: 2014-12
dc.identifier.issn: 0968-090X
dc.identifier.issn: 1879-2359
dc.identifier.uri: http://hdl.handle.net/10026.1/9590
dc.description.abstract:

Real-time monitoring of driver attention by computer vision techniques is a key issue in the development of advanced driver assistance systems. While past work mostly focused on structured feature-based approaches, characterized by high computational requirements, emerging technologies based on iconic classifiers have recently proved to be good candidates for the implementation of accurate, real-time solutions, characterized by simplicity and fast, automatic training stages. In this work the combined use of binary classifiers and iconic data reduction, based on Sanger neural networks, is proposed, detailing critical aspects of applying this approach to the specific problem of driving assistance. In particular, the possibility of a simplified learning stage, based on a small dictionary of poses, is investigated, which makes the system almost independent of the actual user. On-board experiments demonstrate the effectiveness of the approach, even under noise and adverse lighting conditions. Moreover, the system proved unexpectedly robust across various categories of users, including people with beards and eyeglasses. Temporal integration of classification results, together with a partial distinction between visual distraction and fatigue effects, makes the proposed technology an excellent candidate for the exploration of adaptive, user-centered applications in the automotive field.
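The iconic data reduction mentioned in the abstract is based on Sanger neural networks, i.e. Sanger's generalized Hebbian algorithm, which learns principal components of the input online. The sketch below is a minimal illustration of that update rule only; the toy data, learning rate, and iteration counts are illustrative assumptions, not values from the paper.

```python
# Minimal sketch of Sanger's rule (generalized Hebbian algorithm),
# the kind of iconic data reduction the abstract refers to.
# All parameter values here are illustrative assumptions.

def sanger_update(W, x, lr):
    """One Sanger-rule step. W is a list of weight vectors (one per
    output neuron), x an input vector. Each output learns the next
    principal component by subtracting what earlier outputs explain."""
    y = [sum(wi * xi for wi, xi in zip(w, x)) for w in W]
    for i, w in enumerate(W):
        for j in range(len(x)):
            # residual: input minus reconstruction from outputs 0..i
            recon = sum(y[k] * W[k][j] for k in range(i + 1))
            w[j] += lr * y[i] * (x[j] - recon)
    return W

# zero-mean toy samples lying along the direction (1, 2)
data = [(1.0, 2.0), (-1.0, -2.0), (2.0, 4.0), (-2.0, -4.0)]
W = [[0.3, 0.1]]  # a single output neuron -> first principal component
for _ in range(200):
    for x in data:
        sanger_update(W, x, lr=0.01)
# W[0] now approximates the unit vector (1, 2)/sqrt(5), up to sign
```

In the paper's setting the inputs would be face-image patches rather than 2-D points, and the handful of learned components serves as the compact iconic representation fed to the binary classifiers.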

dc.format.extent: 32-42
dc.language: en
dc.language.iso: en
dc.publisher: Elsevier BV
dc.subject: Automotive applications
dc.subject: Monitoring of driver attention
dc.subject: Driver assistance systems
dc.subject: Neural networks
dc.title: Real time detection of driver attention: Emerging solutions based on robust iconic classifiers and dictionary of poses
dc.type: journal-article
dc.type: Journal Article
plymouth.author-url: https://www.webofscience.com/api/gateway?GWVersion=2&SrcApp=PARTNER_APP&SrcAuth=LinksAMR&KeyUT=WOS:000347596100003&DestLinkType=FullRecord&DestApp=ALL_WOS&UsrCustomerID=11bb513d99f797142bcfeffcc58ea008
plymouth.volume: 49
plymouth.publication-status: Published
plymouth.journal: Transportation Research Part C: Emerging Technologies
dc.identifier.doi: 10.1016/j.trc.2014.10.005
plymouth.organisational-group: /Plymouth
plymouth.organisational-group: /Plymouth/Faculty of Science and Engineering
plymouth.organisational-group: /Plymouth/REF 2021 Researchers by UoA
plymouth.organisational-group: /Plymouth/REF 2021 Researchers by UoA/UoA11 Computer Science and Informatics
dc.identifier.eissn: 1879-2359
dc.rights.embargoperiod: No embargo
rioxxterms.versionofrecord: 10.1016/j.trc.2014.10.005
rioxxterms.licenseref.uri: http://www.rioxx.net/licenses/all-rights-reserved
rioxxterms.type: Journal Article/Review
plymouth.oa-location: http://www.sciencedirect.com/science/article/pii/S0968090X14002976?via=ihub


All items in PEARL are protected by copyright law.
Author manuscripts deposited to comply with open access mandates are made available in accordance with publisher policies. Please cite only the published version using the details provided on the item record or document. In the absence of an open licence (e.g. Creative Commons), permissions for further reuse of content should be sought from the publisher or author.