HPC YEARBOOK 2021/22 Evolving AI


With the growth of AI and DL comes new opportunities for emerging applications, finds Robert Roe


As artificial intelligence (AI) and deep learning (DL) technologies mature, there are increasing numbers of applications available to scientists and researchers, who are adopting these methodologies to increase research output. In addition to emerging applications in AI, the accelerator technologies developed for AI and machine learning are now finding new applications in more traditional HPC and scientific computing use cases.

Nvidia has recently announced a collaboration with biopharmaceutical company AstraZeneca and the University of Florida’s academic health centre, UF Health, on new AI research projects using transformer neural networks. Transformer-based neural network architectures – which have become available only in the last several years – allow researchers to leverage massive datasets using self-supervised training methods, avoiding the need for manually labelled examples during pre-training. These models, equally adept at learning the syntactic rules that describe chemistry as they are at learning the grammar of languages, are finding applications across research domains and modalities.

Nvidia is collaborating with AstraZeneca on a transformer-based generative AI model for chemical structures used in drug discovery, which will be among the first projects to run on Cambridge-1, the UK’s largest supercomputer. The model will be open-sourced, available to researchers and developers in the Nvidia NGC software catalogue, and deployable in the Nvidia Clara Discovery platform for computational drug discovery.

Separately, UF Health is harnessing Nvidia’s state-of-the-art Megatron framework and the BioMegatron pre-trained model – available on NGC – to develop GatorTron, the largest clinical language model to date. New NGC applications include AtacWorks, a deep learning model that identifies accessible regions of DNA, and MELD, a tool for inferring the structure of biomolecules from sparse, ambiguous or noisy data.

This is just one example that highlights the success of Nvidia’s drive to capture the AI and DL markets. So far the company has been incredibly successful, but there is mounting pressure from other accelerator technology providers.
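The self-supervised pre-training these transformer models rely on can be illustrated with a toy masked-token objective: hide a fraction of a sequence and treat the hidden tokens as the prediction targets, so the data labels itself. The sketch below is purely illustrative – the character-level tokenizer, mask rate and caffeine SMILES string are assumptions for demonstration, not details of Nvidia’s or AstraZeneca’s pipeline:

```python
import random

def mask_tokens(tokens, mask_rate=0.15, mask_token="[MASK]", seed=0):
    """Randomly hide a fraction of tokens; the hidden originals become
    the training targets, so no manually labelled examples are needed."""
    rng = random.Random(seed)
    masked, targets = [], {}
    for i, tok in enumerate(tokens):
        if rng.random() < mask_rate:
            masked.append(mask_token)
            targets[i] = tok  # the model must predict this from context
        else:
            masked.append(tok)
    return masked, targets

# A SMILES string for caffeine, split into single-character tokens
# (a real chemistry tokenizer would be more sophisticated).
smiles = "CN1C=NC2=C1C(=O)N(C(=O)N2C)C"
masked, targets = mask_tokens(list(smiles))
print("input :", "".join(t if t != "[MASK]" else "_" for t in masked))
print("labels:", targets)
```

The same recipe works whether the tokens are words in a sentence or atoms in a molecule, which is why one architecture transfers across such different domains.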


One such example is Graphcore, the UK-based company developing its own brand of general-purpose accelerators known as intelligence processing units (IPUs). Graphcore released the second generation of IPU products in 2020 and is quickly gaining pace, with exciting benchmarks in both ML and scientific computing.

There are several examples on the Graphcore website, for instance in the areas of drug discovery and life sciences, where the IPU has already been deployed for several different applications. In the example of BERT-Base training, the IPU achieved 25 per cent faster training time at 20 per cent lower power, meaning the algorithm will run faster at a lower cost. BERT-Base inference training against a V100


www.scientific-computing.com




Image: sdecoret/Shutterstock

