Feature: Embedded AI
Figure 3: Processor options for running AI/ML
has spurred a wave of domain-specific architectures: chips designed not for broad computing tasks but to accelerate neural networks with much greater efficiency and lower power consumption. Unlike CPUs, NPUs excel at parallelised matrix multiplications, executing thousands of operations simultaneously with minimal power.
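The workload an NPU accelerates is the multiply-accumulate pattern at the heart of every neural-network layer. A minimal sketch in plain Python (function and variable names are illustrative, not any vendor's API):

```python
def matmul(a, b):
    """Multiply matrix a (m x k) by b (k x n).

    Each output element is an independent chain of multiply-accumulate
    (MAC) operations -- the pattern an NPU spreads across thousands of
    parallel MAC units, where a CPU runs these loops largely in sequence.
    """
    m, k, n = len(a), len(b), len(b[0])
    out = [[0.0] * n for _ in range(m)]
    for i in range(m):
        for j in range(n):
            acc = 0.0
            for p in range(k):
                acc += a[i][p] * b[p][j]  # one MAC operation
            out[i][j] = acc
    return out

# Example: a 2x2 multiply
print(matmul([[1, 2], [3, 4]], [[5, 6], [7, 8]]))  # [[19.0, 22.0], [43.0, 50.0]]
```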
Addressing safety and compliance
Using AI/ML in real-time, safety-critical embedded systems (e.g., autonomous vehicles, medical devices, industrial robots) is challenging due to these models' inherent non-determinism. Their behaviour can change with different inputs or configurations, raising the potential for unpredictability. A common way to mitigate this risk is to convert the trained model into a static, unmodifiable form, a step commonly known as "freezing the model", which is critical for ensuring determinism in safety-critical systems. Take, for example, automotive manufacturers designing an AI system for something truly high-stakes, such as a self-driving car. Lives depend on it behaving exactly the same way every single time, with no surprises. But AI models, especially neural networks, can sometimes act inconsistently: a little slower one day because the hardware ran warm, or rounding a number differently on a GPU versus a CPU. This is a safety engineer's nightmare.
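The rounding drift described above comes from the fact that floating-point addition is not associative: the same numbers summed in a different order can give a different result in the last bits, and different hardware may pick different orders. A self-contained demonstration (the values are deliberately extreme to make the effect visible):

```python
# Floating-point accumulation is order-sensitive: two mathematically
# identical sums differ once rounding enters. This is the kind of
# platform-dependent drift a frozen, bit-exact model must rule out.
vals = [1e16, 1.0, -1e16, 1.0]

left_to_right = sum(vals)           # 1e16 absorbs the first 1.0 -> 1.0
reordered = sum(sorted(vals))       # different order -> 0.0

print(left_to_right, reordered)     # 1.0 0.0
```

This is why certification-minded teams pin down not just the weights but the numeric format and operation order, often by quantising to integer arithmetic, which is exact and identical on every chip.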
Freezing the model, in effect, "locks" every connection, weight and mathematical operation, disabling further tweaks or updates. Tesla's Autopilot is a good example. Its lane-detection AI uses frozen neural networks, demonstrating to regulators (under ISO 26262 and similar standards) that the system behaves exactly as tested, in every car, under all conditions. If Tesla didn't freeze the model, a software update or hardware quirk might subtly change how the car "sees" the road, which would invalidate the regulatory approval. Freezing allows Tesla to say: "We've certified this version, and it won't drift over time". However, freezing alone isn't enough
to pass certifications. Imagine freezing a model that uses 32-bit floating-point numbers. Even tiny differences in how chips handle decimals could cause inconsistencies, like a medical robot calculating a 0.1mm incision as 0.1000001 one day and 0.0999999 the next. For certification, that's unacceptable: auditors want to know exactly what's running on the device. A frozen model becomes a static file, say a .tflite file (TensorFlow Lite model) or .onnx file (Open Neural Network Exchange model), that engineers can hand to auditors and say: "Here's the AI, line by line, byte by byte. Test it once, and it'll behave the same way in all ten million cars". Tesla's frozen models, for instance, are baked into their firmware updates, which undergo rigorous ASIL-D validation, the strictest automotive safety level.

34 | February 2026 | www.electronicsworld.co.uk

But even with freezing, guardrails are
still necessary. A frozen AI might still make a wild guess if it sees something totally new, like a self-driving car encountering a chair in the middle of the road. That’s why systems pair frozen models with rule-based fallbacks (e.g., “if the AI’s confidence drops below 95%, then slam the brakes”) and runtime monitors that cross-check outputs against physics or common sense.
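The guardrail pattern described above can be sketched in a few lines: the frozen model's output is accepted only when its confidence clears a threshold and a simple plausibility check passes; otherwise a rule-based fallback takes over. All names, thresholds and actions here are illustrative assumptions, not any real vehicle's API:

```python
# Hypothetical guardrail wrapper around a frozen model's output.
CONFIDENCE_FLOOR = 0.95        # below this, don't trust the AI
MAX_PLAUSIBLE_SPEED_MPS = 70.0  # runtime monitor: sanity bound

def plan_action(model_output, confidence, measured_speed_mps):
    """Return the AI's action, or a safe fallback if any check fails."""
    if confidence < CONFIDENCE_FLOOR:
        return "BRAKE"  # rule-based fallback on low confidence
    if measured_speed_mps > MAX_PLAUSIBLE_SPEED_MPS:
        return "BRAKE"  # monitor rejects a physically implausible state
    return model_output  # frozen model's decision passes through

print(plan_action("STEER_LEFT", 0.99, 30.0))   # trusted -> STEER_LEFT
print(plan_action("STEER_LEFT", 0.60, 30.0))   # low confidence -> BRAKE
```

The key design point is that the fallback logic is plain, deterministic code, so it can be verified with the traditional techniques discussed next even when the model itself cannot.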
Traditional verification in AI-enabled embedded systems
Embedded safety-critical systems that incorporate AI components must still adhere to foundational verification and validation practices, including static analysis, unit testing, code coverage and traceability. These practices remain critical to ensuring system integrity, regulatory compliance and safety, though their implementation may require adaptation to address the unique challenges posed by AI.
• Static analysis retains its importance even in AI-driven systems. While traditional code, such as control logic, sensor interfaces and safety