dc.contributor.author: Hutt, H
dc.contributor.author: Everson, R
dc.contributor.author: Grant, M
dc.contributor.author: Love, J
dc.contributor.author: Littlejohn, George
dc.date.accessioned: 2017-05-24T19:15:44Z
dc.date.available: 2017-05-24T19:15:44Z
dc.date.issued: 2015-06
dc.identifier.issn: 1432-7643
dc.identifier.issn: 1433-7479
dc.identifier.uri: http://hdl.handle.net/10026.1/9340
dc.description.abstract:

The use of citizen science to obtain annotations from multiple annotators has been shown to be an effective method for annotating datasets for which computational methods alone are not feasible. The way in which annotations are obtained is an important consideration that affects the quality of the resulting consensus annotation. In this paper, we examine three separate approaches to obtaining consensus scores for instances, rather than merely binary classifications. To obtain a consensus score, annotators were asked to make annotations under one of three paradigms: classification, scoring and ranking. A web-based citizen science experiment is described that implements the three approaches as crowdsourced annotation tasks. The tasks are evaluated in terms of accuracy and agreement among the participants, using both simulated and real-world data from the experiment. The results show a clear difference in performance between the three tasks, with the ranking task yielding the highest accuracy and agreement among participants. We show how a simple evolutionary optimiser may be used to improve performance by reweighting the importance of annotators.
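
The record does not include the paper's implementation details, but the final step the abstract describes, fusing individual annotator scores into a consensus and reweighting annotators with a simple evolutionary optimiser, can be sketched as below. This is a minimal illustration only: the weighted-mean consensus, the (1+1) evolution strategy, and the use of a small gold-standard calibration subset as the fitness signal are all assumptions for the sketch, not necessarily the paper's actual consensus rule or fitness function.

import numpy as np

def consensus(scores, weights):
    """Weighted-mean consensus score per image.

    scores  : (n_annotators, n_images) array of individual scores
    weights : (n_annotators,) non-negative annotator weights
    """
    w = np.clip(weights, 0.0, None)
    w = w / w.sum()
    return w @ scores

def evolve_weights(scores, gold, generations=500, sigma=0.1, rng=None):
    """(1+1) evolution strategy: mutate the annotator weights and keep
    the mutant whenever it does not worsen agreement with the gold
    scores on a calibration subset (an assumed fitness signal).
    """
    rng = np.random.default_rng(rng)
    n = scores.shape[0]
    w = np.ones(n)  # start with every annotator equally important

    def fitness(w):
        # Negative mean squared error against the gold scores.
        return -np.mean((consensus(scores, w) - gold) ** 2)

    best = fitness(w)
    for _ in range(generations):
        cand = np.clip(w + rng.normal(0.0, sigma, n), 0.0, None)
        if cand.sum() == 0:
            continue  # degenerate mutant: no annotator has any weight
        f = fitness(cand)
        if f >= best:  # accept non-worse mutants
            w, best = cand, f
    return w / w.sum()

# Toy usage: 5 annotators scoring 20 images, annotator 0 is noisy.
rng = np.random.default_rng(0)
truth = rng.uniform(0, 1, 20)
scores = truth + rng.normal(0, 0.05, (5, 20))
scores[0] += rng.normal(0, 0.5, 20)  # one unreliable annotator
w = evolve_weights(scores, truth, rng=1)
print("learned weights:", np.round(w, 2))  # annotator 0 is downweighted

The design choice mirrors the abstract: rather than discarding noisy annotators, the optimiser learns how much each one's scores should count towards the consensus.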

dc.format.extent: 1541-1552
dc.language: en
dc.language.iso: en
dc.publisher: Springer Science and Business Media LLC
dc.subject: Web-based citizen science
dc.subject: Classification
dc.subject: Consensus score
dc.subject: Crowdsourced annotation tasks
dc.subject: Evolutionary optimiser
dc.subject: Image clump
dc.subject: Ranking
dc.subject: Scoring
dc.subject: Internet
dc.subject: Evolutionary computation
dc.subject: Image classification
dc.subject: Pattern clustering
dc.subject: Microscopy
dc.subject: Correlation
dc.title: How clumpy is my image?
dc.type: journal-article
dc.type: Journal Article
plymouth.author-url: https://www.webofscience.com/api/gateway?GWVersion=2&SrcApp=PARTNER_APP&SrcAuth=LinksAMR&KeyUT=WOS:000354500300008&DestLinkType=FullRecord&DestApp=ALL_WOS&UsrCustomerID=11bb513d99f797142bcfeffcc58ea008
plymouth.issue: 6
plymouth.volume: 19
plymouth.publication-status: Published
plymouth.journal: Soft Computing
dc.identifier.doi: 10.1007/s00500-014-1303-z
plymouth.organisational-group: /Plymouth
plymouth.organisational-group: /Plymouth/Faculty of Science and Engineering
plymouth.organisational-group: /Plymouth/Faculty of Science and Engineering/School of Biological and Marine Sciences
plymouth.organisational-group: /Plymouth/REF 2021 Researchers by UoA
plymouth.organisational-group: /Plymouth/REF 2021 Researchers by UoA/UoA06 Agriculture, Veterinary and Food Science
plymouth.organisational-group: /Plymouth/Users by role
plymouth.organisational-group: /Plymouth/Users by role/Academics
dc.identifier.eissn: 1433-7479
dc.rights.embargoperiod: Not known
rioxxterms.versionofrecord: 10.1007/s00500-014-1303-z
rioxxterms.licenseref.uri: http://www.rioxx.net/licenses/all-rights-reserved
rioxxterms.type: Journal Article/Review

