MICROSCOPY & IMAGING
LEFT: A sample view through the lens of the ARM, with a lymph node metastasis detection model. RIGHT: An Olympus BX43 microscope with the augmented reality and AI module.
“Finally, many deep learning algorithms for microscope images were developed using other digitisation methods, such as expensive whole-slide scanners with customised optics in histopathology. We show that these algorithms generalise to images captured via video output from a standard microscope. These three core capabilities enable the seamless integration of AI into a traditional microscopy workflow,” adds Chen.
The integrated AI is a deep learning algorithm, developed using large annotated datasets, that identifies breast and prostate cancer. The AR is driven by a high-definition multimedia feed from a computer and visualised directly in the microscope eyepiece via a micro display mounted on the side of the microscope. The microscope itself is a standard binocular microscope that has been modified to include a video output to the computer and the attached micro display.
For Chen, the main advantage of combining AI, AR and microscopy technology for cancer diagnosis is that the diagnostic workflow via a standard microscope remains unchanged. To achieve this, he reveals that the ARM system satisfies three major design requirements, namely ‘spatial registration of the augmented information, system response time and robustness of the deep learning algorithms’. Firstly, he points out that AI predictions such as tumour or cell locations need to be precisely aligned with the specimen in the observer’s field of view (FOV) to retain the correct spatial context. Importantly, this alignment must be insensitive to small changes in the user’s eye position relative to the eyepiece (parallax-free) to account for user movements. Secondly, although the latest deep learning algorithms often require billions of mathematical operations, they have to be applied in real time to avoid unnatural latency in the workflow. This is ‘especially critical’ in applications such as cancer diagnosis, where the pathologist is constantly and rapidly panning around the slide.
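To make the real-time requirement concrete, the following is a minimal Python sketch of such a capture-infer-overlay loop, assuming OpenCV for video I/O. The predict_heatmap stub and the overlay weights are illustrative assumptions, not the actual ARM implementation, which the article does not describe in reproducible detail.

    import cv2
    import numpy as np

    def predict_heatmap(frame: np.ndarray) -> np.ndarray:
        """Placeholder for the trained tumour-detection network.
        A real system would run model inference here; this stub
        returns an empty probability map of the same height/width."""
        return np.zeros(frame.shape[:2], dtype=np.float32)

    cap = cv2.VideoCapture(0)  # video output of the modified microscope

    while True:
        ok, frame = cap.read()
        if not ok:
            break

        # Inference must keep pace with the pathologist panning the
        # slide, so each frame is processed with minimal latency.
        probs = predict_heatmap(frame)

        # Convert tumour probabilities into a colour map registered
        # 1:1 with the field of view, then blend it over the frame.
        heat = cv2.applyColorMap((probs * 255).astype(np.uint8),
                                 cv2.COLORMAP_JET)
        overlay = cv2.addWeighted(frame, 0.7, heat, 0.3, 0)

        cv2.imshow("micro display", overlay)  # stands in for the AR display
        if cv2.waitKey(1) == 27:  # Esc to quit
            break

    cap.release()
    cv2.destroyAllWindows()

In the real ARM the blended output is driven to the eyepiece micro display rather than a desktop window, but the structure of the loop, and the need for the model call to complete within a frame interval, is the same.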
ROBOTIC SYSTEM
Elsewhere, teams led by Dr Thomas Marchitto at the University of Colorado, Boulder and Michael Daniele at North Carolina State University (NCSU) are collaborating on a novel project to develop a robotic system capable of imaging, identifying and sorting microscopic fossils known as foraminifera, or forams for short. As Edgar Lobaton, associate professor in the Department of Electrical and Computer Engineering at NCSU, explains, analysis of such specimens can give scientists insights into the biodiversity and conditions of the ocean when the forams were alive. So far, the team has demonstrated that machine learning can identify six different species of forams from their images with performance comparable to that of human experts, and it is now developing a robotic system that will handle the imaging and sorting automatically.
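For illustration, here is a hedged Python sketch of a six-class image classifier in the transfer-learning style commonly used for tasks like this. The ResNet-18 backbone, input size and every identifier below are assumptions for the sketch, not the NCSU team's actual model.

    import torch
    import torch.nn as nn
    from torchvision import models, transforms

    NUM_SPECIES = 6  # the six foram species the team reports identifying

    # Standard transfer learning: reuse an ImageNet backbone and swap
    # the final layer for a six-way classification head.
    model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
    model.fc = nn.Linear(model.fc.in_features, NUM_SPECIES)

    preprocess = transforms.Compose([
        transforms.Resize((224, 224)),
        transforms.ToTensor(),
        transforms.Normalize(mean=[0.485, 0.456, 0.406],
                             std=[0.229, 0.224, 0.225]),
    ])

    def classify(image) -> int:
        """Return the predicted species index for a PIL image of a foram."""
        model.eval()
        with torch.no_grad():
            logits = model(preprocess(image).unsqueeze(0))
        return int(logits.argmax(dim=1))

In a robotic sorting pipeline, a function like classify would sit between the imaging stage and the actuator that routes each specimen to its bin.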
The system set-up at NCSU