Laboratory informatics

Breathing new life into medical research

Sophia Ktori looks at the impact of AI on life science research
Neural networks have been around since the 1970s, but within the last five years the advent of digitised data, the development of complex algorithms, and ease of access to compute power have allowed neural network technology to advance almost exponentially, suggests Abdul Hamid Halabi, global business development lead, healthcare and life sciences at NVIDIA. ‘What neural networks have done is take the legwork out of "teaching" machines. For the uninitiated, and at a basic level, it means that we don’t have to plug every single piece of information into the system. Rather than looking at 600 pathology slides and marking every feature of every cancer cell that you want a computer to recognise, we can just create a large dataset labelled cancer and non-cancer, and give it to the computer to work out.’

In fact, the life science and healthcare sectors have been leveraging machine learning for more than two decades, in areas such as computational chemistry and image analysis, comments Nick Lynch, investment lead and one of the founders of the Pistoia Alliance. ‘For real-world applications, the use of artificial intelligence (AI) within these and other areas of healthcare and life sciences has been boosted enormously by the rapid evolution of deep learning neural network platforms and toolkits such as Google TensorFlow, which have spawned a rich environment of algorithms and platforms.’
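To make the labelled-dataset approach Halabi describes more concrete, the sketch below trains a small binary image classifier with the Keras API in TensorFlow, the toolkit Lynch mentions. It is a minimal illustration only: the `slides/` directory layout, the subfolder names, the image size and the tiny network architecture are assumptions for the sake of the example, not details from the article.

```python
# Minimal sketch: binary image classification with TensorFlow/Keras.
# Assumes a hypothetical directory "slides/" containing two subfolders,
# "cancer/" and "non_cancer/", of labelled image tiles.
import tensorflow as tf

IMG_SIZE = (128, 128)

# Labels are inferred from the subfolder names, so no per-cell annotation is needed.
train_ds = tf.keras.utils.image_dataset_from_directory(
    "slides/", validation_split=0.2, subset="training", seed=42,
    class_names=["non_cancer", "cancer"],  # class 1 = "cancer"
    image_size=IMG_SIZE, batch_size=32, label_mode="binary")
val_ds = tf.keras.utils.image_dataset_from_directory(
    "slides/", validation_split=0.2, subset="validation", seed=42,
    class_names=["non_cancer", "cancer"],
    image_size=IMG_SIZE, batch_size=32, label_mode="binary")

# A small convolutional network; real pathology models are far larger.
model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=IMG_SIZE + (3,)),
    tf.keras.layers.Rescaling(1.0 / 255),
    tf.keras.layers.Conv2D(32, 3, activation="relu"),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Conv2D(64, 3, activation="relu"),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(64, activation="relu"),
    tf.keras.layers.Dense(1, activation="sigmoid"),  # predicted probability of "cancer"
])

model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.fit(train_ds, validation_data=val_ds, epochs=5)
```

The point of the sketch is simply that the labels come from how the dataset is organised rather than from hand-marked features on each slide; published pathology models use far deeper, usually pre-trained networks and carefully curated, expert-labelled tiles.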
Recent reports suggest that over $1.5 billion has been invested in AI healthcare start-ups in the last five years, Halabi notes. ‘In the five-year period from 2012 to 2016 there were a total of about 1,000 papers published on healthcare and deep learning. This year alone we are on track to publish another 1,000.’
AI promises huge potential for life sciences research

Break down the overall healthcare environment into three main sectors, and the potential breadth of applications for AI becomes evident, Halabi points out. The first area is screening, where AI could be used to analyse imaging, lifestyle and other health data to predict or detect disease early, so that preventive or curative measures can be taken. ‘Using deep learning techniques we can train computers to accurately identify suspicious cells and reduce false positives, for example.’
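One common way of trading false positives against sensitivity in a screening classifier is to adjust the decision threshold applied to its output probability. The snippet below is an illustrative sketch in the same hypothetical TensorFlow/Keras setting as the earlier example; `model`, `val_ds` and the threshold values are carried over from that assumed setup and are not taken from the article.

```python
import numpy as np

# Collect predicted probabilities and true labels from the validation set
# of the hypothetical classifier sketched earlier.
probs, labels = [], []
for images, y in val_ds:
    probs.append(model.predict(images, verbose=0).ravel())
    labels.append(y.numpy().ravel())
probs = np.concatenate(probs)
labels = np.concatenate(labels)

def confusion_counts(threshold):
    """Count true/false positives and negatives at a given decision threshold."""
    preds = probs >= threshold
    tp = int(np.sum(preds & (labels == 1)))
    fp = int(np.sum(preds & (labels == 0)))
    fn = int(np.sum(~preds & (labels == 1)))
    tn = int(np.sum(~preds & (labels == 0)))
    return tp, fp, fn, tn

# Raising the threshold above the default 0.5 cuts false positives,
# at the cost of missing some true positives (lower sensitivity).
for t in (0.5, 0.7, 0.9):
    tp, fp, fn, tn = confusion_counts(t)
    print(f"threshold={t:.1f}  sensitivity={tp / max(tp + fn, 1):.2f}  "
          f"false positives={fp}")
```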
AI could also replace physicians in remote geographical areas, or in regions where there are not enough trained personnel, such as radiologists, to carry out screening programmes. ‘You can send scanners out to these areas with relatively untrained technicians, and let AI do the diagnostic work. This is already a reality in Korea, where neural networks have been trained to recognise tuberculosis in chest x-rays, so that patients who screen positive can automatically be called in for treatment.

‘Similarly, a group at Stanford University has trained a neural network to recognise skin cancer lesions. Imagine putting this capability into a