Management, Analysis, and Simulation of Micrographs with Cloud Computing


Shawn Zhang,1* Alan P. Byrnes,2 Jasna Jankovic,3 and Joseph Neilly4


1DigiM Solution LLC, 67 South Bedford St., Suite 400 West, Burlington, MA 01803
2Whiting Oil and Gas Corporation, 1700 Broadway, Suite 2300, Denver, CO 80290


3Materials Science and Engineering Department, University of Connecticut, 97 North Eagleville Rd., Storrs, CT 06268 (Formerly of Automotive Fuel Cell Cooperation Corporation, 9000 North Fraser Way, Burnaby, British Columbia V5J 3J8, Canada)
4AbbVie Inc., NCE Analytical Chemistry, 1 N. Waukegan Rd., North Chicago, IL 60044


*shawn.zhang@digimsolution.com


Abstract: This article discusses the image processing challenges in modern microscopy and microanalysis associated with large dataset size, microstructure complexity, and growing computing requirements. Solutions for meeting these challenges include artificial intelligence and cloud computing, which provide improved efficiency in managing microscopy data, more robust automated image segmentation, and prediction of physical properties from images with user-friendly high-performance computing. The applications of these technologies in the industrial research environment are exemplified by studies in evaluation of amorphous drug formulations, tight rock characterization for sustainable hydrocarbon extraction, and low-temperature fuel cell design for an environment-friendly automobile.


Keywords: Correlative imaging, artificial intelligence, image quantification, image-based simulation, cloud computing


Introduction

Modern microscopy provides extraordinarily useful data about life science and materials science. As more capabilities become available in microanalytical instruments, the data being acquired, managed, and analyzed grows exponentially. For example, some of the following often occur in a microanalysis workflow: (a) you conducted a number of image processing steps but forgot what you did on which saved image, or (b) you struggled with image segmentation because automatic thresholding did not work, or (c) you launched a computation to finish over the weekend, but found on Monday that an operating system update killed your session during a computer reboot, or (d) important images collected years ago were difficult to find. This article describes some solutions to these difficulties.

The problem. Recent advances in microscopy, such as improved resolution, new analytical capabilities, and the addition of a third spatial dimension, provide insights for both fundamental research and various industrial applications. The rapid adoption of advanced microscopy, however, has led to an explosion of imaging data. Different imaging methods, such as micro-computed X-ray tomography (MicroCT) and scanning electron microscopy (SEM), have different resolution capabilities and various fields of view (FOV). Analysis using multiple imaging methods at various resolutions (that is, correlative imaging) is often needed to solve complex problems. Furthermore, a sequence of processing and analysis steps is typically required. Each step creates new data that is similar in size to, or larger than, the original image. For example, a three-dimensional (3D) MicroCT dataset often has 2000 images, with each image having 2000×2000 pixels with 16-bit grayscale intensity values. This dataset requires approximately 16 gigabytes (GB) of storage. A median image filter, applied for an enhanced signal-to-noise ratio, will create a processed dataset of the same size as the original data. However, an autocorrelation function, describing how well a microstructure phase correlates with itself when it is displaced in all possible directions (hence capturing dispersion patterns of the phase, such as periodicity), produces data in 32-bit floating-point values, which is twice the size of the original data, or 32 GB. Ultimately, this simple, two-step microanalysis workflow generates a total of 64 GB of data. Practical microanalysis workflows often require more than two processing steps. Table 1 summarizes these steps and the corresponding data increases.

doi:10.1017/S1551929519000026
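The two-step workflow and its storage arithmetic can be sketched in a few lines of Python. This is an illustrative sketch only, assuming NumPy and SciPy are available; a small synthetic volume stands in for the real 2000-slice stack, while the byte counts are computed for the full-size dataset described above.

```python
import numpy as np
from scipy import ndimage

# Small synthetic volume standing in for a MicroCT stack; the real
# dataset in the text is 2000 slices of 2000x2000 16-bit pixels.
rng = np.random.default_rng(0)
vol = rng.integers(0, 2**16, size=(32, 32, 32), dtype=np.uint16)

# Step 1: median filter for signal-to-noise -- same dtype, same shape,
# so the output occupies exactly as much storage as the input.
filtered = ndimage.median_filter(vol, size=3)

# Step 2: autocorrelation via FFT (Wiener-Khinchin theorem), stored as
# 32-bit floats -- twice the per-voxel storage of the 16-bit original.
f = np.fft.fftn(filtered.astype(np.float32))
autocorr = np.fft.ifftn(f * np.conj(f)).real.astype(np.float32)

# Storage bookkeeping for the full-size 2000^3 dataset from the text.
voxels = 2000 * 2000 * 2000
raw_gb = voxels * 2 / 1e9        # 16-bit grayscale -> 16 GB
median_gb = raw_gb               # same dtype and shape -> 16 GB
autocorr_gb = voxels * 4 / 1e9   # 32-bit float -> 32 GB
print(raw_gb, median_gb, autocorr_gb, raw_gb + median_gb + autocorr_gb)
```

Running this confirms the totals in the text: 16 GB raw, 16 GB filtered, 32 GB of autocorrelation values, 64 GB in all.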


Table 1: Steps of a typical image-analysis workflow and the corresponding data increases.

Category                 Analysis steps                                    Data increase factor (multiples of original)
Imaging                  Multiple imaging modalities on the same sample;   2–3
                         multiple resolutions using the same modality;
                         multiple FOVs at the same resolution
Recording pixel depth    8-bit; 16-bit; 16-bit color (64-bit total);       4–8
                         32-bit; 64-bit
Image processing         Cropping/transformation; noise reduction;         2–4
                         artifact removal; segmentation;
                         morphological operations
Image analysis           Autocorrelation function; particle size           2–3
                         distribution
Image-based simulation   Transport properties; mechanical properties;      2–4
                         thermal properties; electrical properties
Data management          Archiving and backup                              2–3

www.microscopy-today.com • 2019 March
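To see how quickly these per-category factors add up, the sketch below treats them as compounding multiplicatively across a workflow that touches every category once. This is an illustrative assumption, not a claim from the article, and the pairing of factors to categories follows the row ordering of Table 1.

```python
# Per-category (low, high) data-increase factors from Table 1.
factors = {
    "imaging": (2, 3),
    "recording pixel depth": (4, 8),
    "image processing": (2, 4),
    "image analysis": (2, 3),
    "image-based simulation": (2, 4),
    "data management": (2, 3),
}

# Compound the factors across one pass through every category.
low = high = 1
for lo, hi in factors.values():
    low *= lo
    high *= hi

print(low, high)            # 128 3456
print(16 * low, 16 * high)  # GB range for the 16 GB example dataset
```

Under this (admittedly worst-case) compounding, a single 16 GB acquisition could generate anywhere from roughly 2 TB to over 50 TB of derived data before archiving, which motivates the data-management solutions discussed in this article.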

