used as safety components. However, if a Notified Body certifies compliance, the certification expires after five years, after which the AI system must be reassessed and recertified.


The easiest way to demonstrate conformity with the requirements is to apply standards that are harmonised to the AI Act. The standards have not yet been written or harmonised, but they are due to be ready by the end of April 2025. If suitable standards are not available, or if the European Commission finds them inadequate, the AI Act allows Implementing Acts to establish common specifications. Complying with these common specifications provides a presumption of conformity in the same way as complying with harmonised standards.


In addition to the harmonised standards and/or common specifications, the European Commission will publish guidelines on the application of the AI Act. Once an AI system has been assessed as being in compliance with the requirements of the Act, a Declaration of Conformity (DoC) can be drawn up. If an AI system is embedded within another product such as a machine, the DoC can be incorporated within the machine’s Machinery Directive DoC. Similarly, the technical documentation compiled for compliance with the AI Act can be incorporated within that relating to the Machinery Directive.


Instrumentation Monthly April 2025


To indicate the claimed compliance, high-risk AI systems must have a physical CE marking applied. Where this is not possible, the CE mark should be applied to the packaging or accompanying documentation. The physical marking may be complemented by a digital CE marking. For high-risk AI systems that are only provided digitally, a digital CE marking should be used. If a CE marked machine features an embedded high-risk AI system used as a safety component, the machine’s CE marking must indicate compliance with both the Machinery Directive and the AI Act.


An important point for providers outside the EU is that the AI Act requires an Authorised Representative (AR) to be appointed. The AR must be a natural or legal person in the EU with a mandate from the provider to perform certain tasks under the AI Act. Both the instructions and the DoC must show the AR’s identity and contact details. Another point of detail is that high-risk AI systems must be designed so that natural persons can oversee their functioning. For an AI system used as a safety component, it will be interesting to see how machine builders fulfil this requirement.


ONGOING OBLIGATIONS
After a high-risk AI system has been placed on the market, the provider is obliged to undertake post-market monitoring for the system’s lifetime. If any serious incidents occur, these must be reported to the relevant market surveillance authority.


High-risk AI systems used as safety components are required to have automatic event logging, which will assist with post-market surveillance.


If an AI system undergoes substantial modification, then its conformity must be reassessed. ‘Substantial modification’ includes using an AI system for a purpose for which it was not originally intended.


PENALTIES FOR NON-COMPLIANCE
Within the AI Act, there are rules governing penalties for non-compliance, including failing to provide the relevant authorities with information or access upon request. Penalties can apply to providers, deployers, importers, distributors and authorised representatives, as well as notified bodies. Penalties take the form of fines of up to EUR 15 million or 3 per cent of worldwide annual turnover. Fines relating to prohibited AI systems are substantially higher.
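To illustrate how the two penalty figures interact, the short sketch below computes the applicable maximum fine, assuming the cap is the higher of the fixed sum and the turnover percentage (the usual construction in EU legislation); the function name and figures outside the article's own EUR 15 million / 3 per cent are illustrative.

```python
def max_fine_eur(worldwide_annual_turnover_eur: float) -> float:
    """Illustrative cap on AI Act fines for most infringements,
    assuming the maximum is the higher of a fixed sum
    (EUR 15 million) and 3% of worldwide annual turnover."""
    FIXED_CAP = 15_000_000
    TURNOVER_RATE = 0.03
    return max(FIXED_CAP, TURNOVER_RATE * worldwide_annual_turnover_eur)

# A company with EUR 2 billion turnover: 3% is EUR 60 million,
# which exceeds the fixed cap, so the higher figure applies.
print(max_fine_eur(2_000_000_000))

# A smaller company with EUR 100 million turnover: 3% is only
# EUR 3 million, so the fixed EUR 15 million cap applies instead.
print(max_fine_eur(100_000_000))
```

The percentage-of-turnover limb means the cap scales with the size of the business, so large undertakings cannot treat the fixed sum as a predictable cost of non-compliance.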


For advice on complying with the AI Act (Artificial Intelligence Regulation (EU) 2024/1689), as well as information about appointing an Authorised Representative, contact Safe Machine.


Safe Machine www.safemachine.co.uk

