Regulatory


The EU's AI Act will apply to everyone from a carmaker considering its vehicle safety functions to a financial institution using a credit scoring model. However, given the inroads that AI has made into healthcare, life sciences companies stand to be among the businesses most affected. The rules lay out four tiers of risk, from ‘unacceptable risk’ at the top (such as social scoring by governments) to limited or minimal risk at the bottom (AI-enabled video games). In the middle sit medical devices, many of which are deemed ‘high risk’ by definition.


“Like many other regulated products, AI systems will need to have a CE marking if they pose a high risk to safety, high risk to health, or high risk in terms of fundamental rights,” explains Vladimir Murovec, head of life sciences regulatory for Belgium at Osborne Clarke. “That’s also true if your product is already regulated on the European market – for instance, if it’s also a medical device algorithm or in vitro diagnostic software.” To put it differently, medical device manufacturers will clearly need to pay attention to the rules. But they certainly won’t be the only ones affected. A notable feature of the AI Act is its extension of accountability to ‘deployers of AI systems within the supply chain’ – in practice meaning that anyone who uses these technologies for business purposes will be subject to additional scrutiny.


“The impact goes through the entire supply chain, from AI being used in the preclinical stages of clinical trials, to predicting market trends,” Murovec stresses. “Healthcare professionals, care centres, dental clinics, and any medtech company that uses AI in a business context, will all be subject to the ‘deployer’s obligation’. So this goes really far in terms of scope, and I think the impact is going to be quite deep.”


Innovation vs regulation


For the past few years, medical devices within the EU have been regulated by the Medical Devices Regulation (MDR) and the In Vitro Diagnostic Regulation (IVDR), which came into force in May 2021 and May 2022 respectively. To an extent, notes Alexander Olbrechts, director of digital health at MedTech Europe, there is some overlap between the existing rules and the new AI Act. “Examples of elements addressed in both regulations include risk management and quality management requirements, technical documentation and the need for undergoing conformity procedures,” says Olbrechts. “It is critical that, per the AI Act’s Article 8.2, the duplicative or additional requirements of the AI Act can simply be integrated into existing MDR/IVDR processes and procedures and existing documentation.”




By way of example, AI-based medical device software is already regulated by the MDR/IVDR. Under these regulations, manufacturers need to submit the software to a so-called ‘Notified Body’, which is responsible for assessing its safety and performance. The hope is that, under the AI Act, existing MDR/IVDR software codes will be maintained. “This will be a key instrument to mitigate the risk of additional and unnecessary assessments, and by extension avoid any barrier to innovation in the European medical technology sector,” Olbrechts adds. “If we can arrive at a point whereby the AI Act and MDR/IVDR work seamlessly and complementarily, it will go a long way to generate that trust in AI-enabled medical technologies.”


Indeed, the regulators have made a concerted effort not to create unnecessary burdens for businesses. Declaring that AI “can contribute to solving” a range of societal challenges, they’ve made it clear that they do not want to stifle innovation or delay market entry for emergent technologies. At the same time, hitting the brakes for a while is not necessarily a bad thing. After all, the main idea behind the AI Act is to ensure that AI systems are safe, ethical and accessible – a situation the industry would likely favour even if it meant more bureaucracy.


“When you’re talking about healthcare, in the same way as when you’re talking about driverless cars, you want to make sure that its safety has been thoroughly interrogated,” points out Will James, the international sector head of life sciences and healthcare at Osborne Clarke.

As well as contending with the ‘deployer’s obligation’ for the first time, life sciences companies will face new obligations around accuracy, cybersecurity, monitoring and transparency, extending throughout the whole supply chain. Providers will need to go through additional pre-market assessments to build up their technical documentation, while new Notified Bodies will have to be accredited too. Existing manufacturers, whose products are already on the market, will likewise be obliged to conduct a thorough review to make sure their applications comply.

Beyond that, there are various areas of uncertainty on which the industry will be seeking further guidance. For one thing, it isn’t yet clear whether devices deployed in clinical trials will need to be certified under the AI Act beforehand, or whether they will qualify for a so-called ‘research exemption’. “The AI Act’s research exemption tells you, ‘well, actually, you don’t need to comply with the new regulation if your AI system is being specifically developed and put into service solely for research purposes,’” says Murovec. “The impact of this


€35m


The top fine payable for breaching the AI Act (or 7% of global annual turnover, whichever is higher).


Source: WilmerHale

