Regulatory
initial regulatory approval then covers the changes outlined in this plan, allowing for rapid changes and improvements to systems without requiring a recertification process each time. This front-loads some of the work for medical device manufacturers seeking approval of AI products. But for Dennis, the evolving regulatory picture on AI does present one notable area in which such companies will need to step up – namely, post-market activities. “With medical devices, you have always got to be consistently on alert, and that is your quality management system. But the thing is that most companies are not very good at this,” she says. “They go through the process of getting certificates for their devices, and they get the certificate and get the champagne out, and they sit back. But actually, it really doesn’t end there.”
Nowhere is that truer than with AI, given that such products will be constantly evolving by their very nature. “Medical device companies are going to have to really upskill in post-market surveillance and monitoring, to be able to show they’re confident their devices are working as they should,” suggests Dennis. It is not the only likely skill gap. Data privacy may also be an area in which manufacturers need to build new expertise. That’s because clause 69 of the EU act states that the right to privacy and to protection of data “must be guaranteed throughout the entire life cycle of the AI system”.
“But companies normally have engineers who have become regulatory professionals. What they don’t have are data privacy professionals,” says Dennis. “So they are going to have to hire those into their regulatory teams.”
There are also likely to be complicated conversations about data sharing in the context of AI-supported medical devices. For manufacturers (referred to as “providers” in the EU act) to be able to monitor their devices, they need data. Yet that data is collected by the hospitals or clinics using those devices (referred to as “deployers” in the act). “Basically, unless you have obligations on your deployer to provide you with that data, and to do the surveys to let you have access to your patients, you are not going to be able to get the data that you need for your post-market surveillance,” says Dennis. Meanwhile, the deployer also has obligations to ensure the safe use of devices, and it cannot meet them without support from manufacturers. Yet the support needed in the context of AI may not be something there is currently the expertise to provide. “Reps from manufacturers are going to have to be trained in looking for signs that indicate problems with AI systems, trained in what they have to check, and are going to need to do more than just watch it being used,” suggests Dennis. “They’re going to have to be able to grab the data to be able to do trend reporting for their devices.”
Data privacy, transparency and post-market surveillance
In addition, the EU act introduces a separate but linked requirement to be transparent about how AI devices work. “I think writing the documentation that goes with an AI system is going to become an art in itself: explaining it, how it works, what it does, what it doesn’t do, and things it might do but you’re not sure about,” says Dennis. “I think that’s going to be a slightly uncomfortable process.” With the measures of the EU act due to be fully implemented from August 2026, these are processes manufacturers need to start grappling with now. For Corman, doing so is increasingly part of such companies making their best possible contribution to society.
“Medical device manufacturers help make the world a better place and help save lives,” Corman says. “But there is a cost to connectivity of devices, and there is a cost to complexity. I think manufacturers need to make sure they know what that cost is, and be sure that they include the agency and values of the patients they claim to serve in the course of their design. We need humane technology.” ●