
dc.contributor.author: Zahra, Daniel
dc.contributor.author: Burr, Steven
dc.date.accessioned: 2018-11-26T09:12:26Z
dc.date.available: 2018-11-26T09:12:26Z
dc.date.issued: 2018-12
dc.identifier.issn: 2212-2761
dc.identifier.issn: 2212-277X
dc.identifier.uri: http://hdl.handle.net/10026.1/12868
dc.description.abstract:

INTRODUCTION: Ongoing monitoring of cohort demographic variation is an essential part of quality assurance in medical education assessments, yet the methods employed to explore possible underlying causes of demographic variation in performance are limited. Focussing on properties of the vignette text in single-best-answer multiple-choice questions (MCQs), we explore here the viability of conducting analyses of text properties and their relationship to candidate performance. We suggest that such analyses could become a routine part of assessment evaluation and provide an additional, equality-based measure of an assessment's quality and fairness.

METHODS: We describe how a corpus of vignettes can be compiled, and give examples of using Microsoft Word's native readability statistics and the koRpus text analysis package for the R statistical environment to estimate the following properties of the question text: Flesch Reading Ease (FRE), Flesch-Kincaid Grade Level (Grade), word count, sentence count, and average words per sentence (WpS). We then show how these properties can be combined with equality and diversity variables, and how the process can be automated to provide ongoing monitoring.

CONCLUSIONS: Because demographic differences in assessment performance are already monitored to assure equality, the ability to easily include textual analysis of question vignettes provides a useful tool for exploring possible causes of such variation where it occurs, and a further means of evaluating assessment quality and fairness with respect to demographic characteristics. Microsoft Word provides data comparable to the specialized koRpus package, suggesting that routine use of word-processing software for writing items and assessing their properties is viable with minimal burden. Automation for ongoing monitoring also offers an additional means of standardizing MCQ assessment items and of eliminating or controlling textual variables as a possible contributor to differential attainment between subgroups.
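To make the METHODS summary concrete, the sketch below shows one way the listed properties might be computed with koRpus. It is illustrative only and not the authors' published code: it assumes one plain-text file per vignette, that the koRpus.lang.en language package is installed for English support, and hypothetical file and column names for the final merge with item-level demographic data; exact accessor names may differ across koRpus versions.

## Minimal illustrative sketch (not the authors' published code). Assumes the
## koRpus and koRpus.lang.en packages and one plain-text file per vignette;
## the file path and the merge step at the end are hypothetical.
library(koRpus)
library(koRpus.lang.en)
set.kRp.env(lang = "en")

# Tokenise one vignette's text
tokens <- tokenize("vignettes/item_001.txt", format = "file", lang = "en")

# Word count, sentence count, and average words per sentence (WpS)
counts <- describe(tokens)
wps <- counts$words / counts$sentences

# Flesch Reading Ease (FRE) and Flesch-Kincaid Grade Level (Grade)
rd <- readability(tokens, index = c("Flesch", "Flesch.Kincaid"))
summary(rd)   # reports the FRE score and the Flesch-Kincaid grade

# For ongoing monitoring, the same steps can be looped over a folder of
# vignette files and the resulting table merged with item-level performance
# broken down by demographic group, e.g. (hypothetical names):
# merged <- merge(text_stats, read.csv("item_facility_by_group.csv"), by = "item")

For comparison, Microsoft Word reports similar FRE, grade-level, word, sentence, and words-per-sentence figures in its readability statistics dialog when "Show readability statistics" is enabled and a spelling and grammar check is run on the vignette text.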

dc.format.extent: 401-407
dc.format.medium: Print
dc.language: eng
dc.language.iso: en
dc.publisher: Springer (part of Springer Nature)
dc.subject: Assessment questions
dc.subject: Equality and diversity
dc.subject: Language complexity
dc.title: Analysis of question text properties for equality monitoring.
dc.type: journal-article
dc.type: Journal Article
plymouth.author-url: https://www.ncbi.nlm.nih.gov/pubmed/30353285
plymouth.issue: 6
plymouth.volume: 7
plymouth.publication-status: Published online
plymouth.journal: Perspectives on Medical Education
dc.identifier.doi: 10.1007/s40037-018-0478-x
plymouth.organisational-group: /Plymouth
plymouth.organisational-group: /Plymouth/Faculty of Health
plymouth.organisational-group: /Plymouth/Users by role
plymouth.organisational-group: /Plymouth/Users by role/Academics
dc.publisher.place: Netherlands
dcterms.dateAccepted: 2018-10-08
dc.rights.embargodate: 2019-1-11
dc.identifier.eissn: 2212-277X
dc.rights.embargoperiod: Not known
rioxxterms.versionofrecord: 10.1007/s40037-018-0478-x
rioxxterms.licenseref.uri: http://www.rioxx.net/licenses/all-rights-reserved
rioxxterms.type: Journal Article/Review




All items in PEARL are protected by copyright law.
Author manuscripts deposited to comply with open access mandates are made available in accordance with publisher policies. Please cite only the published version using the details provided on the item record or document. In the absence of an open licence (e.g. Creative Commons), permissions for further reuse of content should be sought from the publisher or author.