Trend
NPUs: the main rival to Nvidia’s AI dominance
By Dorian Maillard, vice president, DAI Magister

Fuelled by the rise of generative AI, the semiconductor market is experiencing a period of great profitability: it is already worth over $600bn, with projections to reach $1 trillion by 2030. Nvidia supplies the graphics processing units (GPUs),
which play a central role in AI applications. Despite efforts from companies such as Microsoft, AWS and Google to develop their own AI chips, Nvidia remains the clear frontrunner in the AI hardware market, thanks to the high performance of its solutions and a well-established ecosystem. However, GPUs are expensive and energy intensive, which calls into question whether their widespread use is sustainable in the long run. It is estimated that a single AI search query consumes up to ten times more energy than a standard Google search, highlighting the need for initiatives that mitigate the cost and carbon footprint of AI while remaining competitive.
Going green and powerful

Environmental concerns are driving the development of more energy-efficient algorithms and hardware, which will lay the foundations for the mass adoption of domain-specific processors optimised for the efficient execution of AI tasks. These processors are known as neural processing units (NPUs). NPUs are engineered to accelerate AI workloads, including deep learning and inference. They can process large
volumes of data in parallel and swiftly execute complex AI algorithms, using specialised on-chip memory for efficient data storage and retrieval. While GPUs offer greater processing power and versatility, NPUs are smaller, less expensive and more energy efficient. Counterintuitively, NPUs can also outperform GPUs in specific AI tasks because of their specialised architecture. Key NPU applications include enhancing efficiency and productivity in industrial IoT and automation, powering infotainment systems and autonomous driving in the automotive sector, and enabling high-performance smartphone cameras, augmented reality, facial and emotion recognition, and fast data processing. GPUs and NPUs can also be used together for even greater efficiency. In data centres and machine-learning/deep-learning environments, the two technologies complement one another to accelerate the training of AI models, especially where energy conservation and low latency are required.
Additional funding on the way

We expect fundraising activity in the AI-related NPU edge-device sector to continue on an upward trajectory. Several factors will drive this momentum: the growing importance of AI in almost all industries, increasing investment in R&D, and surging demand for high-performance, low-power chips. Moreover, with tech giants such as Microsoft, AWS and Google actively seeking to develop or acquire AI chip technologies, market consolidation is on the cards. These tech behemoths are seeking not only to expand their capabilities but also to stay competitive against Nvidia’s formidable presence.
04 September 2024
www.electronicsworld.co.uk