HIGH PERFORMANCE COMPUTING


The impact of AI


ROB FARBER CONSIDERS THE EFFECT THAT AI IS HAVING ON HPC HARDWARE AND APPLICATION DEVELOPMENT


Just as technology changes in the personal computer market brought about a revolution in the design and implementation of the systems and algorithms used in high-performance computing (HPC), so are recent technology changes in machine learning bringing about an AI revolution in the HPC community. Expressed as a single word, the extent of the impact of AI on HPC is 'everywhere'. Deep learning, in particular, has been revolutionising visual recognition tasks that previously could only be performed by humans. It is truly remarkable that these deep artificial neural networks (DNNs) are able to match, and even outperform, human abilities on a wide variety of visual tasks. The scientific applications are being investigated by scientists and HPC centres worldwide, using some of the largest supercomputers in the world. Reflecting this trend, the Argonne Leadership Computing Facility (ALCF) continues its effort to create an environment that supports traditional simulation-based research, as well as emerging data science and machine learning approaches, in preparation for Aurora, the first US exascale supercomputer.

Overall, machine learning is changing how scientists perform research and interact with data, with remarkable research efforts bringing AI technology to exascale computation, high-energy physics (HEP), materials design, climate simulation, and more. Don't be misled into thinking that the impact is limited to high-profile research projects or big computer centres. The use of AI permeates everything in HPC, from research and software to hardware design, including dataflow hardware such as FPGAs and ASICs, together with the impact on more 'traditional' hardware like CPUs and GPUs (graphics processing units). In line with Feynman's quantum conjecture, researchers believe that machine learning maps well to quantum hardware. Looking to the future, hardware implementations of neuromorphic computing promise to redefine edge computing with sensors and the Internet of Things (IoT), thanks to orders-of-magnitude increases in efficiency. Just as DNNs brought artificial neural networks (ANNs) into the mainstream, so may neuromorphic computing multiply that impact, through ubiquity and through the removal of humans from the data-intensive preparation of clean and relevant training sets, as neuromorphic computing systems can identify their own training sets.

“This work demonstrates the viability of machine learning models in physics-based simulations” – Federico Carminati, project coordinator, CERN

Overall, big data, extreme computing and machine learning are causing us to rethink the role of a supercomputer. At the moment, HPC modelling and simulation is experiencing a revolutionary change in mindset, as ANNs are now being incorporated directly into strict physics-based codes. More specifically, new algorithmic efforts at high-profile institutions like CERN use GANs (generative adversarial networks), which have been shown to bring orders-of-magnitude performance increases to physics-based modelling and simulation, while preserving accuracy and without introducing non-physical effects. More indirectly, reduced-precision support for AI is redefining the numerical and matrix methods that are core to HPC. Everywhere you look, AI is being incorporated by researchers into new hardware, techniques and technologies.
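To make the GAN-as-surrogate idea concrete, the sketch below trains a small generator/discriminator pair in PyTorch against a toy stand-in for an expensive simulator. This is a minimal illustration, not CERN's code: the one-dimensional Gaussian 'simulator', the network sizes and the hyperparameters are all assumptions chosen only to show the shape of the approach.

```python
import torch
import torch.nn as nn

def simulate(batch_size):
    # Hypothetical stand-in for an expensive physics simulation;
    # a real target would be e.g. calorimeter shower observables.
    return torch.randn(batch_size, 1) * 0.5 + 2.0

# Generator maps random noise to fake "simulation" samples;
# discriminator scores whether a sample looks real or generated.
G = nn.Sequential(nn.Linear(8, 32), nn.ReLU(), nn.Linear(32, 1))
D = nn.Sequential(nn.Linear(1, 32), nn.ReLU(), nn.Linear(32, 1))

opt_g = torch.optim.Adam(G.parameters(), lr=1e-3)
opt_d = torch.optim.Adam(D.parameters(), lr=1e-3)
bce = nn.BCEWithLogitsLoss()

for step in range(2000):
    # Discriminator step: real samples vs. detached fakes.
    real = simulate(64)
    fake = G(torch.randn(64, 8)).detach()
    d_loss = bce(D(real), torch.ones(64, 1)) + bce(D(fake), torch.zeros(64, 1))
    opt_d.zero_grad()
    d_loss.backward()
    opt_d.step()

    # Generator step: try to make D label fresh fakes as real.
    fake = G(torch.randn(64, 8))
    g_loss = bce(D(fake), torch.ones(64, 1))
    opt_g.zero_grad()
    g_loss.backward()
    opt_g.step()

# After training, the generator replaces the simulator: one cheap
# forward pass per batch instead of a full physics computation.
samples = G(torch.randn(1000, 8))
print(samples.mean().item(), samples.std().item())
```

The speed-up described above comes from that final step: drawing surrogate samples is a single matrix-heavy forward pass, which maps directly onto the GPU and reduced-precision hardware the article discusses.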


Same bandwagon – computation-driven adoption

For a period of roughly 10 years starting in the mid-1980s, the field of machine learning exploded with the advent of nonlinear neural networks, backpropagation, the work of John Holland on genetic algorithms, work on Hidden Markov Models, and the precursor work on neuromorphic computing performed by Carver Mead. The field then stagnated for a number of years, due to a lack of computational power coupled with promises made by many who entered the field but did not fully understand the limitations of the technology. As a result of many broken



