HPC > Silicon Photonics


“from The University of Oxford, it’s fitting that we work together on the Innovate UK project to further enhance the capabilities of these advanced optical systems.”

Accelerating HPC and GPU computing
The continued scaling of HPC and AI systems relies on significant increases in energy efficiency. Today’s largest supercomputers can consume as much as 20MW, and with the continued demand for AI, energy demand may rise further. Improving energy efficiency is not a new concept for HPC system providers or those that manage or provision HPC clusters. For many years a 20MW power envelope has been discussed for exascale computing. Frontier may have missed the 20MW target by a small margin, but the system demonstrates how far the HPC industry has come in developing efficient HPC systems. For example, the Sunway TaihuLight supercomputer was the world’s most powerful system when it launched in 2016. In the November 2022 edition of the Top500, the system sits in seventh position, delivering a performance of 93 petaflops with a power consumption of 15,371kW (15.3MW). By comparison, the LUMI system, launched in 2022, delivers more than three times the performance of Sunway TaihuLight at 309 petaflops but consumes just a fraction of the power at 6,016kW (6MW).
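Expressed as performance per watt, the gap between the two systems is even clearer. Below is a minimal sketch of the arithmetic, using only the figures quoted above; it is indicative only, since Top500 power measurements vary in methodology.

```python
# Back-of-the-envelope efficiency comparison using the figures quoted above
# (performance in petaflops, power in kW, as reported for the two systems).
systems = {
    "Sunway TaihuLight (2016)": {"petaflops": 93, "power_kw": 15_371},
    "LUMI (2022)": {"petaflops": 309, "power_kw": 6_016},
}

for name, s in systems.items():
    # 1 petaflop/s = 1e6 gigaflop/s; 1 kW = 1e3 W
    gflops_per_watt = (s["petaflops"] * 1e6) / (s["power_kw"] * 1e3)
    print(f"{name}: {gflops_per_watt:.1f} GFlops/W")

# Prints roughly 6.1 GFlops/W for Sunway TaihuLight and 51.4 GFlops/W for LUMI:
# about an eight-fold improvement in energy efficiency in six years.
```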


However, as HPC, cloud and AI computing continue to grow, more scientists and researchers are gaining access to advanced computing infrastructure, and there is still significant growth in the cloud and AI markets. As the demand for AI supercomputers grows, steps must be taken to make the technology more sustainable. Several key market verticals in the life sciences, such as biotechnology and pharmaceuticals, are shifting towards the use of AI for the classification of medical images and to aid the search for new drugs and treatments. In engineering, there has been a shift from traditional testing and validation to complex digital twins, with comprehensive digital verification and testing in simulation. The automotive market is also seeing demand for training large-scale models for autonomous driving, and for the use of AI in other applications such as topology optimisation. Compounding this effect in science and engineering is the growth of data-intensive workloads.

Integrating photonic technologies into classical computing could help to significantly reduce the power budget of moving data across systems. While this may seem a small saving, training AI models requires moving large amounts of data.


Photonic computing promises to open up new possibilities for data transfer and communications between computing elements, which could significantly increase HPC performance. Several early players have emerged in this market with technologies that could find their way into future supercomputers. Lightelligence is an MIT spin-out using photonics to reinvent computing for artificial intelligence. The company launched its first fully integrated optical computing platform, PACE (Photonic Arithmetic Computing Engine), in 2021. PACE leverages the inherent properties of light to generate optimal solutions to the Ising, Max-Cut and Min-Cut problems more than 800 times faster than current high-end GPUs, while maintaining high throughput, low latency and energy efficiency. In a 2021 interview with MIT News, Lightelligence CEO Dr Yichen Shen said: “We’re changing the fundamental way computing is done, and I think we’re doing it at the right time in history. We believe optics will be the next computing platform, at least for linear operations like AI.”
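The Ising, Max-Cut and Min-Cut problems the company cites belong to the same family of combinatorial optimisation problems. As a purely generic illustration (not Lightelligence’s interface), the sketch below maps Max-Cut on a toy graph onto an Ising energy minimisation and solves it by brute force; an analogue optical engine evaluates the same energy landscape in hardware rather than enumerating states.

```python
import itertools
import numpy as np

# Generic illustration only: Max-Cut expressed as an Ising energy minimisation.
# Each vertex carries a spin s_i in {-1, +1}. For edge weights w_ij,
#   cut(s) = sum_{i<j} w_ij * (1 - s_i*s_j) / 2,
# so maximising the cut is equivalent to minimising H(s) = sum_{i<j} w_ij * s_i*s_j.

w = np.array([          # weighted adjacency matrix of a 4-vertex toy graph
    [0, 1, 1, 0],
    [1, 0, 1, 1],
    [1, 1, 0, 1],
    [0, 1, 1, 0],
], dtype=float)

best_energy, best_spins = None, None
for spins in itertools.product([-1, 1], repeat=len(w)):   # brute force over 2^n states
    s = np.array(spins)
    energy = 0.5 * s @ w @ s        # equals sum_{i<j} w_ij * s_i * s_j (diagonal is zero)
    if best_energy is None or energy < best_energy:
        best_energy, best_spins = energy, s

total_weight = w.sum() / 2                       # each edge counted once
cut_size = 0.5 * (total_weight - best_energy)    # cut = (W_total - H_min) / 2
print("partition:", best_spins, "cut size:", cut_size)   # cut size 4 for this graph
```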


The PACE platform also has the potential for use in autonomous driving systems, and has shown itself to be more powerful than high-end GPUs in some applications. “Our chip completes these decision-making tasks at a fraction of the time of regular chips, which would enable the AI system within the car to make much quicker decisions and more precise decisions, enabling safer driving,” said Shen.

‘We believe optics will be the next computing platform, at least for linear operations like AI’
Dr Yichen Shen, CEO, Lightelligence

The core technology that underpins PACE is a 64x64 optical matrix multiplier in an integrated silicon photonic chip and a CMOS microelectronic chip, flip-chip packaged together. In addition to its advanced 3D packaging, PACE’s photonic chip contains more than 12,000 discrete photonic devices and has a system clock of 1GHz.
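A fixed-size multiply engine like this is normally driven by tiling larger problems. The following is a hedged sketch, with an invented stand-in function (optical_matmul_64 is not a real Lightelligence API), showing how a 64x64 unit could be fed 64x64 tiles of a larger matrix product, with accumulation handled electronically.

```python
import numpy as np

TILE = 64  # the article describes a 64x64 optical matrix multiplier

def optical_matmul_64(a_tile, b_tile):
    """Hypothetical stand-in for the 64x64 optical multiply; emulated with NumPy here."""
    assert a_tile.shape == (TILE, TILE) and b_tile.shape == (TILE, TILE)
    return a_tile @ b_tile

def blocked_matmul(a, b):
    """Compute A @ B for shapes that are multiples of 64, issuing one
    64x64 multiply per tile pair and accumulating the partial products."""
    m, k = a.shape
    k2, n = b.shape
    assert k == k2 and m % TILE == 0 and k % TILE == 0 and n % TILE == 0
    c = np.zeros((m, n))
    for i in range(0, m, TILE):
        for j in range(0, n, TILE):
            for p in range(0, k, TILE):
                c[i:i+TILE, j:j+TILE] += optical_matmul_64(
                    a[i:i+TILE, p:p+TILE], b[p:p+TILE, j:j+TILE])
    return c

# A 256x256 product decomposes into (256/64)^3 = 64 tile multiplies.
a = np.random.rand(256, 256)
b = np.random.rand(256, 256)
assert np.allclose(blocked_matmul(a, b), a @ b)
```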


Overcoming the memory bottleneck
In 2022, Hewlett Packard Enterprise and photonic computing startup Ayar Labs signed a multi-year strategic collaboration to accelerate the networking performance of computing systems and data centres by developing silicon photonics solutions based on optical I/O technology. This was soon followed by news that Ayar Labs had secured $130m in additional funding from Boardman Bay Capital Management, Hewlett Packard Enterprise (HPE) and Nvidia, as well as multiple new and existing financial investors, including GlobalFoundries and Intel Capital. Silicon photonics will enhance networking capabilities and support future requirements for high-performance computing (HPC), artificial intelligence (AI) and cloud computing architectures.


For more info about silicon photonics, visit: www.scientific-computing.com/hpc

