COVER STORY


Harnessing Edge AI for a sustainable path to competitive advantage


Artificial intelligence (AI) remains one of the hottest topics in technology today, driving a remarkable wave of innovation across virtually all industries. From smartphones to media applications and business solutions, AI is increasingly present in our daily lives, with its influence expanding rapidly across sectors. However, one critical aspect that often gets overlooked is the significant energy consumption associated with AI computing. While AI continues to advance and provide numerous benefits, it is crucial to acknowledge the environmental impact of these technologies. For instance, a server farm running AI processors can consume up to 160 per cent more energy than a traditional server farm, highlighting the challenges posed by this growth. As organisations increasingly adopt AI solutions, the demand for energy-efficient systems and sustainable practices becomes ever more pressing. Balancing innovation with environmental responsibility is essential to ensure that the rapid advancement of AI does not come at the expense of our planet's health. In this month's edition, Andrew Pockson, engineering manager at Anglia, explores how edge computing can be used to reduce energy consumption and latency in AI applications, as well as to improve reliability, security and overall cost-effectiveness.


The high power usage associated with artificial intelligence (AI) is largely due to the complexity of the processors utilised, which contain multiple cores specifically designed to handle intensive computations effectively. These advanced processors are crucial for managing deep learning algorithms, including the large language models (LLMs) that power popular AI systems such as ChatGPT. As AI applications continue to grow in scale and sophistication, addressing the energy consumption and carbon footprint of the underlying AI infrastructure becomes a key challenge for the industry. Finding effective ways to optimise energy efficiency while still maintaining high performance is critical for the sustainable development of AI technology. As a result, the tech industry must strive to balance innovation with sustainability to continue harnessing the immense power of AI without compromising environmental responsibility. This dual focus is essential not only for reducing ecological impacts but also for ensuring that AI technologies can be developed and deployed responsibly in the future.


The technology sector is already taking positive steps and making substantial investments in significant innovations for artificial intelligence (AI), particularly focused on reducing power consumption and enhancing overall efficiency. This ongoing effort involves not only making AI processors more energy-efficient but also reconsidering how and where AI processing occurs. Traditionally, most AI processing has been handled in large data centres, commonly referred to as the cloud. However, it is becoming increasingly clear that not all AI tasks need to be processed at these centralised server farms. Many AI functions can now be performed locally, using the processing power available on the user's device or on nearby gateways that connect those devices to the internet. This shift towards localised processing not only optimises energy use but also improves response times and performance, making AI applications more accessible and efficient. By leveraging local processing capabilities, the technology sector can further reduce the environmental impact of AI while continuing to drive innovation across various industries.

10 November 2024 Components in Electronics
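The latency case for running inference on the device rather than in a distant data centre can be illustrated with a small sketch. This is a toy example, not code from the article: the model is a stand-in threshold check, and the 80 ms network round trip is a made-up placeholder; both paths do the same work, but the cloud path pays the network delay first.

```python
import time

NETWORK_ROUND_TRIP_S = 0.08  # hypothetical 80 ms round trip to a cloud data centre


def toy_inference(reading: float) -> str:
    """Stand-in for a small on-device model: classify one sensor reading."""
    return "alert" if reading > 0.7 else "normal"


def edge_infer(reading: float) -> tuple[str, float]:
    """Run the model locally; latency is essentially just compute time."""
    start = time.perf_counter()
    result = toy_inference(reading)
    return result, time.perf_counter() - start


def cloud_infer(reading: float) -> tuple[str, float]:
    """Same model, but pay a simulated network round trip before the result returns."""
    start = time.perf_counter()
    time.sleep(NETWORK_ROUND_TRIP_S)  # upload request + download response
    result = toy_inference(reading)
    return result, time.perf_counter() - start


if __name__ == "__main__":
    _, edge_latency = edge_infer(0.9)
    _, cloud_latency = cloud_infer(0.9)
    print(f"edge:  {edge_latency * 1000:.2f} ms")
    print(f"cloud: {cloud_latency * 1000:.2f} ms")
```

The gap only widens in practice: a real cloud path adds serialisation, queueing and variable network conditions on top of the fixed round trip simulated here.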


This approach, known as AI edge computing, brings several notable advantages that significantly enhance the performance and efficiency of artificial intelligence applications. By processing data closer to where it is generated, such as on user devices or local gateways, AI edge computing effectively reduces latency, leading to much faster response times and improved real-time performance for various applications. Furthermore, this localised processing decreases the reliance on cloud resources, which in turn results in lower energy consumption and reduced operational costs for organisations. Additionally, this method enhances privacy and security, as sensitive data does not always need to be transmitted to the cloud for processing, minimising the risk of data breaches. Overall, AI edge computing has emerged as a crucial innovation in the tech landscape, offering a more sustainable, efficient and responsive solution for handling AI tasks across a wide array of applications, from smart devices to industrial automation and beyond. As such, it represents a significant step forward in the quest for energy-efficient and secure AI technologies.

Advantages of AI edge computing

Reduced latency: AI edge computing processes data locally, near the source of data generation (such as IoT devices), which significantly reduces latency compared with cloud computing. This is crucial for real-time applications, such as the automated guided vehicles (AGVs) used for material handling in smart factories and warehouses, or healthcare monitoring systems.

Enhanced reliability: AI systems at the edge can continue to function even when there is limited or no cloud connectivity. This makes edge computing a more reliable option in remote areas or in scenarios where constant internet access isn't guaranteed.

Bandwidth efficiency: In AI cloud computing, substantial amounts of data need to be transmitted to centralised data centres, consuming considerable bandwidth. Edge computing reduces the amount of data that must be sent to the cloud, conserving network resources and enabling faster decision-making.

Cost efficiency: By reducing the need for constant data
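The bandwidth-efficiency point can be sketched in a few lines of Python. This is an illustrative toy, not code from the article: the threshold and the summary fields are invented, but the pattern is the one described above, where an edge gateway keeps the raw sensor stream local and forwards only a compact summary plus out-of-range readings to the cloud.

```python
# Hypothetical edge gateway: filter a raw sensor stream locally and
# forward only a compact summary plus anomalies to the cloud.

THRESHOLD = 0.9  # made-up anomaly threshold


def summarise_at_edge(samples: list[float]) -> dict:
    """Reduce a raw sample batch to the payload actually worth uploading."""
    anomalies = [s for s in samples if s > THRESHOLD]
    return {
        "count": len(samples),
        "mean": sum(samples) / len(samples),
        "anomalies": anomalies,  # only out-of-range readings travel upstream
    }


if __name__ == "__main__":
    # 1,000 raw readings stay on the device; only a few fields go upstream.
    raw = [0.5] * 997 + [0.95, 0.99, 0.91]
    payload = summarise_at_edge(raw)
    print(payload["count"], len(payload["anomalies"]))
```

Here a batch of 1,000 readings collapses to a count, a mean and three anomalous values, which is the kind of reduction that conserves network resources while still letting the cloud see what matters.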





