Wahlich, C; Chandrasekaran, L; Chaudhry, UAR; Willis, K; Chambers, R; Bolter, L; Anderson, J; Shakespeare, R; Olvera-Barrios, A; Fajtl, J; Welikala, R; Barman, S; Egan, CA; Tufail, A; Owen, CG; Rudnicka, AR
(2024)
Patient and practitioner perceptions around use of artificial intelligence within the English NHS diabetic eye screening programme.
Diabetes Res Clin Pract, 219.
p. 111964.
ISSN 1872-8227
https://doi.org/10.1016/j.diabres.2024.111964
SGUL Authors: Wahlich, Charlotte Amy
PDF (Published Version): Available under License Creative Commons Attribution (1MB)
Microsoft Word (.docx) (Supplementary Data 1): Supplemental Material (19kB)
Abstract
AIMS: Automated retinal image analysis using Artificial Intelligence (AI) can detect diabetic retinopathy as accurately as human graders, but it is not yet licensed in the NHS Diabetic Eye Screening Programme (DESP) in England. This study aims to assess perceptions of People Living with Diabetes (PLD) and Healthcare Practitioners (HCP) towards AI's introduction in DESP. METHODS: Two online surveys were co-developed with PLD and HCP from a diverse DESP in North East London. Surveys were validated through interviews across three centres and distributed via DESP centres, charities, and the British Association of Retinal Screeners. A coding framework was used to analyse free-text responses. RESULTS: 387 (24%) PLD and 98 (37%) HCP provided comments. Themes included trust, workforce impact, the patient-practitioner relationship, AI implementation challenges, and inequalities. Both groups agreed AI in DESP was inevitable, would improve efficiency, and save costs. Concerns included job losses, data security, and AI decision safety. A common misconception was that AI would directly affect patient interactions, though it only processes retinal images. CONCLUSIONS: Limited understanding of AI was a barrier to acceptance. Educating diverse PLD groups and HCP about AI's accuracy and reliability is crucial to building trust and facilitating its integration into screening practices.
Item Type: Article
Additional Information: Crown Copyright © 2024 Published by Elsevier B.V. This is an open access article under the CC BY license (http://creativecommons.org/licenses/by/4.0/).
Keywords: Artificial intelligence, Diabetes, Qualitative, Screening, Technology, 1103 Clinical Sciences, 1117 Public Health and Health Services, 1701 Psychology, Endocrinology & Metabolism
SGUL Research Institute / Research Centre: Academic Structure > Population Health Research Institute (INPH)
Journal or Publication Title: Diabetes Res Clin Pract
ISSN: 1872-8227
Language: eng
Publisher License: Creative Commons: Attribution 4.0
PubMed ID: 39709112
URI: https://openaccess.sgul.ac.uk/id/eprint/117053
Publisher's version: https://doi.org/10.1016/j.diabres.2024.111964