Focus


Open-weight models such as DeepSeek-R1 are a step in the right direction, but they must be paired with the right AI-enabled silicon to unlock their true potential.


Why open-weight AI models matter

DeepSeek is part of a larger movement toward open AI innovation, following in the footsteps of models like LLaMA and Mistral. By offering transparency and flexibility through customisation, open-weight models enable developers to fine-tune AI for specialised applications (industrial IoT, automotive, robotics, etc.); reduce dependence on cloud providers for inference, lowering costs and increasing control; and optimise performance for edge deployments, where compute resources are limited. With DeepSeek-R1, this movement is evolving further. The distillation approach used in R1 allows for smaller, more-efficient models that still retain high reasoning capabilities. This is critical for edge AI, where power and memory constraints make deploying large models unfeasible.
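The distillation described here can be pictured as training a compact "student" model to match a larger "teacher's" softened output distribution. The sketch below illustrates the classic soft-target KL-divergence loss in generic form; it is not DeepSeek's actual training recipe, and the logit values and temperature are arbitrary placeholders.

```python
import math

def softmax(logits, temperature=1.0):
    """Temperature-scaled softmax; higher temperature softens the distribution."""
    scaled = [z / temperature for z in logits]
    m = max(scaled)  # subtract max for numerical stability
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(teacher_logits, student_logits, temperature=2.0):
    """KL(teacher || student) over softened distributions -- the soft-target
    term a student minimises so its predictions mimic the teacher's."""
    p = softmax(teacher_logits, temperature)
    q = softmax(student_logits, temperature)
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q))

# A student that matches the teacher exactly incurs zero loss.
assert distillation_loss([2.0, 1.0, 0.1], [2.0, 1.0, 0.1]) < 1e-12
```

The temperature is what makes distillation work: softening the teacher's distribution exposes its relative confidence across wrong answers ("dark knowledge"), which carries far more training signal than a one-hot label.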


AI at the edge

Open AI models like DeepSeek-R1 are just one part of the equation. To make AI truly viable at the edge, we need hardware designed to handle these models efficiently and cost-effectively. At Synaptics, we've built the Astra platform with this exact challenge in mind. Astra is an AI-native compute platform designed for power-efficient, multimodal AI inference in embedded and IoT devices. By leveraging Arm Cortex-A processors and tightly integrated AI acceleration, Astra enables real-time AI processing at the edge – no need for cloud offloading. The distilled models from DeepSeek-R1


provide an ideal complement to this approach. These models maintain high performance in reasoning tasks while significantly reducing compute requirements, making them well suited for AI-native edge devices. This synergy between open AI models, distillation and optimised edge compute will define the next phase of AI innovation. Imagine a world where smart home devices can process user interactions locally, preserving privacy and reducing latency. Or where industrial sensors leverage AI for real-time anomaly detection, preventing costly downtime. Or, indeed, where AI-driven medical devices provide real-time diagnostics without requiring cloud connectivity. In these types of applications, open-weight AI models paired with AI-native processors will redefine what's possible.

www.electronicsworld.co.uk | September 2025

As AI adoption accelerates, the industry is recognising that proprietary, cloud-centric models alone won't be enough. Open-weight AI like DeepSeek-R1 represents a pivotal shift toward scalable, customisable and efficient intelligence, but to truly bring AI everywhere, we need compute platforms built for real-world constraints.
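To put "reducing compute requirements" in concrete terms, a back-of-envelope weight-memory calculation shows why distilled checkpoints suit the edge. The parameter counts below are assumptions drawn from publicly reported sizes (DeepSeek-R1 at roughly 671B parameters; distilled variants at 7B and 1.5B), with fp16 storage at two bytes per weight.

```python
# Back-of-envelope: weight-memory footprint of a full model vs distilled variants.
# Parameter counts are illustrative assumptions, not official specifications.

def weight_memory_gib(n_params: float, bytes_per_param: int = 2) -> float:
    """Approximate weight storage in GiB at a given precision (fp16 = 2 bytes)."""
    return n_params * bytes_per_param / 2**30

full_r1     = weight_memory_gib(671e9)  # ~1250 GiB: data-centre territory
distill_7b  = weight_memory_gib(7e9)    # ~13 GiB: high-end embedded
distill_1p5 = weight_memory_gib(1.5e9)  # ~2.8 GiB: feasible on edge SoCs

print(f"{full_r1:.0f} GiB vs {distill_7b:.1f} GiB vs {distill_1p5:.1f} GiB")
```

Even before quantisation, dropping from hundreds of gibibytes to a few brings the weights within reach of embedded-class memory budgets, which is the point of pairing distilled models with edge silicon.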



