HIGH PERFORMANCE COMPUTING


“The sustainability of AI is a conversation that green-minded organisations desperately need to start having”

comparable to the amount produced by 125 round-trip flights from New York to Beijing. Given the significant impact that ever-expanding AI research could have on the environment, it is critical that the field of AI starts to weigh sustainability against utility.

How organisations can power sustainable AI

Information and Communications Technology (ICT) already accounts for approximately four per cent of worldwide carbon emissions, according to research by The Shift Project, and its contribution to greenhouse gas emissions is 60 per cent higher than that of the aviation industry. As more enterprises and organisations turn to AI and machine learning applications in an effort to drive innovation, there is a corresponding increase in demand for cloud-optimised data centre facilities. If Anders Andrae, senior researcher at Huawei, is right in his prediction that by 2025 data centres will account for 33 per cent of global ICT electricity consumption, the sustainability of AI is a conversation that green-minded organisations desperately need to start having.

There are positive steps companies can take to minimise their carbon footprint whilst still accessing cutting-edge supercomputing to drive their innovations. Given that machine learning and deep learning applications consume an enormous amount of energy, companies need to ensure that the data centres housing those applications can efficiently handle the high-density compute involved, at industrial scale. Many corporate data centres simply are not equipped to handle these demands: according to a survey of 100 data centres published on ScienceDirect, 61 per cent were operating with systems running at their lowest efficiency. Likewise, it is crucial that these facilities are powered by renewable energy sources. If power-hungry AI applications are housed in fossil-fuel-powered facilities, energy-efficiency efforts can quickly be undone.

Equally, organisations that rely on cloud service providers should verify their provider’s green credentials. If a cloud provider’s data centre is located somewhere like the UK, which is predominantly powered by natural gas, then no matter how many green certificates it boasts, it is still, at the end of the day, powered by fossil fuel. Despite what the label suggests, these green certificates do not always signify that the energy being used is renewable: energy can be certified green even if it is not, as the certificates are akin to a carbon-offset programme.

Data centre location is also a key consideration when it comes to sustainable AI. Cooling the air inside data centres can be expensive and relatively inefficient, and in hotter climates keeping hardware cool is particularly energy-intensive. Crucially, more than 80 per cent of computer hardware does not need to be located near the end user in terms of latency or accessibility. Acknowledging this, it is both economically and ecologically sound business practice to house AI servers somewhere with a consistently cool climate.

Tech giants, such as Google, are investing in data centres in Nordic countries specifically because of their better energy efficiency compared to locations in warmer climates. In a conventional data centre, cooling IT equipment accounts for 40 per cent of the total energy consumed. In countries like Iceland, however, which is perennially cool, natural cooling of powerful AI servers minimises energy usage and results in considerable energy savings. Further, Iceland’s energy is sourced from 100 per cent renewable geothermal and hydroelectric power, and its national grid is modern and reliable, meaning the AI systems housed there operate more efficiently and run on cleaner energy. By making smarter choices about where AI compute is located, organisations can make a substantial impact on the sustainability of their AI.


The future of AI needs to be green

The growth and proliferation of AI show no signs of abating. According to research firm IDC, worldwide spending on AI systems will be close to $98 billion (£75.4 billion) in 2023. If, as the UN has warned, 2020 is the year the world must act to avoid runaway climate change, the field of AI – and the organisations that drive it – must begin the new decade by tackling the sustainability of AI head-on, concentrating on energy efficiency from research start to enterprise finish.

