Focus
DeepSeek and the future of Open AI: What it means for edge computing
By John Weil, Vice President of IoT and Edge AI Processor Business, Synaptics
The AI industry is at an inflection point. For years, deep learning advancements have been driven by massive, proprietary models trained in the cloud, making AI adoption an expensive and centralised endeavour. But a new shift is underway, one that emphasises openness, efficiency and scalability, particularly for edge computing.

DeepSeek, an emerging open-weight AI model, is a powerful example of this trend. Its development highlights the growing movement toward democratising AI, providing developers and enterprises with new ways to integrate intelligence across devices without the constraints of proprietary, cloud-based models. With the latest release of DeepSeek-R1, this trend is accelerating. DeepSeek-R1 is trained via large-scale reinforcement learning, allowing it to develop strong reasoning capabilities autonomously. Benchmark results show that it performs on par with OpenAI-o1-1217 on tasks like maths, coding and factual knowledge retrieval. Moreover, DeepSeek-R1 includes distilled versions (1.5B, 7B, 14B, 32B and 70B parameters) optimised for efficiency, making it highly relevant for edge computing.

But open-weight AI models alone aren't enough. For AI at the edge to reach its full potential, it requires efficient, AI-native compute platforms that can handle these models in real-world scenarios. This is where innovations in low-power, high-performance MPUs and MCUs play a crucial role.
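A rough, back-of-envelope calculation illustrates why the smaller distilled checkpoints matter at the edge: the memory needed just to hold a model's weights scales with parameter count and numeric precision. The sketch below is an illustrative estimate only; the quantisation widths are assumptions, and it ignores activations, KV cache and runtime overhead:

```python
# Approximate weight storage for the distilled DeepSeek-R1 sizes mentioned
# above, at full FP16 precision versus an assumed 4-bit quantisation.

def weight_memory_gb(params_billion: float, bits_per_weight: int) -> float:
    """Memory to store the weights alone, in decimal gigabytes."""
    bytes_total = params_billion * 1e9 * bits_per_weight / 8
    return bytes_total / 1e9

for size in (1.5, 7, 14, 32, 70):
    fp16 = weight_memory_gb(size, 16)
    int4 = weight_memory_gb(size, 4)
    print(f"{size:>4}B model: ~{fp16:.1f} GB at FP16, ~{int4:.2f} GB at 4-bit")
```

On these assumptions, a 1.5B model quantised to 4 bits needs well under 1 GB for weights, which is why the smallest distilled variants are plausible candidates for memory-constrained edge hardware, while the 70B variant remains a server-class workload.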
10 September 2025
www.electronicsworld.co.uk
The AI compute challenge

AI workloads today are increasingly constrained by compute demands. The dominant model of AI deployment has centred on large-scale cloud inference, where models like GPT-4 or Gemini require massive GPU clusters to function effectively. While this approach works for centralised setups, it becomes impractical for edge-based applications like smart cameras, industrial automation and intelligent IoT devices that require real-time processing and autonomy. This challenge has driven demand for efficient AI models that can run closer to the data source, minimising latency, power consumption and connectivity dependencies, whilst enhancing security and privacy. Open-weight models like