Abstract

The Usability Evaluation of Digital Health Technologies: Methods and Insights
Inocencio Daniel Cortes Maramba

Background: Digital health technology holds promise for improving healthcare delivery, but usability is crucial for adoption and user safety. Whilst large companies often have in-house user experience (UX) expertise, the small and medium enterprises (SMEs) developing many of these technologies lack guidance for conducting usability evaluations. Furthermore, evidence of proper user testing is now required by the Digital Technology Assessment Criteria (DTAC) used by the National Health Service (NHS).

Aim: To find practical ways to help SMEs carry out user experience testing.

Methods: The research had three stages: (a) a scoping review of the literature to identify existing methods and barriers; (b) four case studies, comprising usability evaluations of the following digital health applications: (i) MyPreOp, a preoperative assessment software; (ii) Wholesome World, a nutritional advice mobile app; (iii) Cornish GP practice websites, used for accessing electronic consultations in Cornwall; and (iv) the Baby Check App, a symptom checker for young children. The experiences and lessons learned from each evaluation were systematically compared using a cross-case synthesis approach; and (c) development of a usability evaluation toolkit tailored for SMEs, informed by this synthesis.

Results: (a) The scoping review analysed 133 articles that met the inclusion criteria, published between April 2014 and October 2017. The most frequently used methods were questionnaires (n=105), followed by task completion (n=57), 'think-aloud' (n=45), interviews (n=37), heuristic testing (n=18), and focus groups (n=13). However, limited resources and lack of access to users often hinder robust evaluations.

(b) The sample sizes and key quantitative findings of the case studies were as follows:

Case Study 1: MyPreOp. A total of 2,593 patients provided complete data across two phases (Phase 1: n=1,193; Phase 2: n=1,400). In Phase 1, 80% of respondents had a "good or better" experience, and 90% found the system "easy to use". In Phase 2, the usability rating using the Health-ITUES was 4.13 out of a maximum of 5, indicating high usability. Dividing the Health-ITUES questionnaire into five blocks of four questions had only a small effect on completion times (mean increase of seven minutes) and minimal impact on the variance of ratings (epsilon squared = 0.011).

Case Study 2: Wholesome World. Twenty respondents took part (mean age 39.85 years; 60% female; 35% with cancer). The app scored highly in usability, with an overall Health-ITUES mean score of 4.79 (SD=0.26) out of a maximum of 5. The heuristic evaluation scored the app at 40 out of 52 items on the usability checklist.

Case Study 3: GP practice websites. Ten GP websites were reviewed in January 2020 (five high-use, five low-use), and all 57 Cornish GP practice websites were reviewed in summer 2021. In summer 2021, most websites (38/57) had at least one accessibility error, and most (39/57) had at least one area of poor contrast according to WCAG 2.1. Most websites (79%; 45/57) allowed access to eConsult or another triage system in one or two clicks; however, six took three clicks, and six had no link. Practices requiring more clicks to reach eConsult tended to have lower rates of use (e.g., 1 click: mean 122.8 eConsults; 3 clicks: mean 52.4 eConsults), although this correlation was not statistically significant (p=0.06).

Case Study 4: Baby Check App. Sixty healthcare professionals participated in the survey, and 25 health visitors participated in the focus group. Most respondents (88%; 53/60) were familiar with the Baby Check App, and of those familiar, 96% (51/53) had recommended it to parents/carers. Respondents gave a median rating of 5 out of 5 (IQR=1) for the usefulness of the app in supporting parental decision-making. Challenges were reported by 21% (11/53), with inability to download the app and the lack of other languages being common issues.

Across the case studies, common barriers included the lack of specific guidance for SMEs, difficulty accessing user test groups, and the burden of participation, especially for users with complex conditions. Enablers included dividing lengthy questionnaires into blocks, combining validated tools with bespoke questions, and using heuristic walkthroughs and automated accessibility checks. The case studies provided valuable insights into the practical application of these methods.

(c) Based on these findings, a usability evaluation toolkit was developed specifically for SMEs. The toolkit is organised by development stage, suggesting appropriate usability activities, expected outputs, and resource estimates.

Discussion: There are many methods that SMEs can use to address the usability of their products and services, and the appropriate choice depends on the nature of the product and its stage of development. The toolkit summarises best practices from the literature and the experience of the four case studies, and should benefit stakeholders as follows. For researchers, it provides a foundation for further work, such as validating the toolkit's usefulness, adding methods for evaluating new interfaces (such as virtual reality and voice assistants), and creating an interactive version of the toolkit. For developers, it offers concrete guidance for conducting user experience evaluations and for collecting evidence to satisfy the DTAC. For practitioners and policy makers, it informs health professionals about patient involvement in usability evaluations and guides the assessment of evidence submitted by developers. This research contributes to improved usability evaluation practices in digital health technologies developed by SMEs, ultimately enhancing patient safety and user experience.

Recommendations: Further research is needed to validate the toolkit's effectiveness and to update it for emerging technologies such as virtual reality and voice interfaces. Developing an interactive digital version of the toolkit is also recommended.

Awarding Institution(s)

University of Plymouth

Award Sponsors

European Regional Development Fund

Supervisor

Ray Jones, Arunangsu Chatterjee, Craig Newman, Edward Meinert

Keywords

eHealth, User experience, Usability, Digital health, Digital health technologies

Document Type

Thesis

Publication Date

2025

Embargo Period

2025-09-19

Deposit Date

September 2025

Creative Commons License

Creative Commons Attribution-NonCommercial 4.0 International License
