Clinical engineering


“We also need to consider: what absolute accuracy should we expect?” Mark commented. “Don’t forget, radiologists get it wrong around 8-12% of the time.” He used the analogy of autonomous cars.


If a Tesla kills someone on a zebra crossing, it becomes global news. People respond by saying that “autonomous cars are rubbish”. Meanwhile, millions of people are killed in road traffic accidents around the world due to human error, yet this doesn’t make the news. “We need to overcome these mental blocks. What error rate will you accept: 1%, 2%, 3%, or 4%? Is it acceptable for an AI algorithm to misdiagnose 5% of the time?” he continued.

In addition, when ensuring best practice, he said it is important to consider:

- Are you getting the accuracy claimed by the AI vendor?
- How do you guard against changes in the AI or in the imaging chain?
- How do you ensure AI serves all patients (avoiding blind spots)?
- How do you increase confidence in AI while ensuring best practice?


When advancing care in AI, the following questions must be asked:

- When do we trust AI to diagnose autonomously?
- How do we actually increase efficiency and reduce our backlog?
- What will happen if we trust AI for certain use cases?
- How do we track evidence that AI improves patient outcomes?


Other key considerations include the fact that AI cannot contextualise the patient and their journey (yet) – it is just pattern recognition. He added that, in the future, we will have AI that can see the blood tests, scans, patient history and GP information. It will be able to build multiple neural networks and an overarching neural network. It will start to make predictions even before the scan – whether there is a tumour, or if there is some other disease that may become evident in five years’ time, for example.

In summary, Mark pointed out that AI in diagnostics can be applied to improve image quality, workflow and, eventually, diagnostic accuracy. He further emphasised that CE marking, at any level, does not validate AI to run autonomously. It is currently illegal to let AI in diagnostic imaging run autonomously for diagnosis, despite misinformation in the market suggesting otherwise. Finally, there is no ground truth available on a wide scale – only raw data. Ultimately, he warned that AI for diagnostics




September 2023 | www.clinicalservicesjournal.com



