EDITOR’S COMMENT with LOUISE FRAMPTON

THE CLINICAL SERVICES JOURNAL

Editor
Louise Frampton
louiseframpton@stepcomms.com
Technical Editor Kate Woodhead
Journal Administration Katy Cockle
katycockle@stepcomms.com
Design Steven Dillon
Sales Executive
Steve Elliman
stephenelliman@stepcomms.com
Business Manager James Scrivens
jamesscrivens@stepcomms.com
Publisher Geoff King
geoffking@stepcomms.com
Publishing Director Trevor Moon
trevormoon@stepcomms.com
STEP COMMUNICATIONS ISSN No. 1478-5641
© Step Communications Ltd, 2025 Single copy: £19.00 per issue. Annual journal subscription: UK £114.00 Overseas: £150.00
The Clinical Services Journal is published in January, February, March, April, May, June, August, September, October and November by Step Communications Ltd, Step House,
North Farm Road, Tunbridge Wells, Kent TN2 3DR, UK.
Tel: +44 (0)1892 779999
Email: info@clinicalservicesjournal.com
Web: www.clinicalservicesjournal.com
The Publisher is unable to take any responsibility for views
expressed by contributors. Editorial views are not necessarily shared by the journal. Readers are expressly advised that while the contents of this publication are believed to be accurate, correct and complete, no reliance should be placed upon its contents as being applicable to any particular circumstances.
This publication is copyright under the Berne Convention and the International Copyright Convention.
All rights reserved, apart from any copying under the UK Copyright Act 1956, part 1, section 7. Multiple copying of the contents of this publication without permission is always illegal.
Follow us: @csjmagazine
Follow the CSJ LinkedIn page. Search Clinical Services Journal
Navigating the ethics of AI in healthcare
In this latest edition, CSJ explores the ethical implications of artificial intelligence (AI) in healthcare. In their article, ‘Implementing AI: the ethical frontier’, Kenza Benkirane and Dr. Julia Mokhova warn that AI should be “an assistant not a doctor”. They point out that the integration of AI in acute healthcare settings represents “an extraordinary opportunity and a significant challenge”. But what do patients think? Are they concerned about the use of AI in their care? Do they want
to talk to ‘bots’? And are they worried about the use of their data? Research published by the Health Foundation shows that the public is hesitant about
technologies that might be seen to ‘distance’ patients from healthcare staff or come between them, such as care robots. People named the ability to see and talk to NHS staff as their most important consideration when thinking about technology use in healthcare, with older people particularly concerned about this (39% of people aged 65 years and older named this as their most important consideration).

In the survey (of over 7,000 members of the public), three-quarters (75%) said that they would support sharing some of their personal health data for the development of AI systems in the NHS. The majority were willing to share data on areas such as their eye health (59%), medicines they are taking (58%) and any long-term illnesses they live with (57%). However, the results found less willingness to share some types of data, with only 47% willing to share smartphone-tracked data such as sleep activity and 44% willing to share sexual health information (fewer than the 46% who were opposed). The poll also revealed differences in willingness to share data between socioeconomic groups.
Notably, people from socioeconomic groups D and E (in households where the main earner is semi-skilled, unskilled or not in work) are significantly less likely to support the use of any of their health data for AI development than people from other socioeconomic groups. For example, while 16% of people from socioeconomic group A are not happy for any of their health data to be used, this compares with 39% of people from socioeconomic group E.

Analysing the results, the Health Foundation noted that these findings highlight the importance of engaging all social groups in the development of new technologies, to ensure they work for everyone and do not inadvertently create or worsen inequalities. The Health Foundation’s Director of Innovation and Improvement, Dr. Malte Gerhold, said: “It is
encouraging that most people are open to sharing their data to develop AI systems in the NHS. When properly implemented, we know that AI has the potential to free up staff by supporting clinical and administrative tasks. However, these systems are only as good as the data used to design and develop them.

“There are significant differences between socioeconomic groups in levels of support for sharing data for AI development and for taking part in activities to shape how technology is used in the NHS. Policymakers, NHS leaders and those involved in designing and implementing healthcare technologies must proactively engage with people across different social groups to ensure that healthcare technologies help tackle inequalities, rather than worsen them.”

The technology is exciting and transformative, but bringing patients of all backgrounds on board may be the biggest challenge of all. We need to be mindful of the potential for bias, AI’s technical limitations (including ‘drift’), and its ethics, as we navigate unfamiliar territory. Ultimately, patients need to be part of the discussion. They need to be involved in how we integrate these systems in healthcare, and in where we integrate them. After all, “no decision about me, without me” must also apply to the implementation of technology.
Get in touch and give us your views. Email me: louiseframpton@stepcomms.com
January 2025 | www.clinicalservicesjournal.com