can cause jitter, where the latency is dynamic or, in the worst case, causes packet loss. The key to delivering the speed and reliability needed for remote microscopy is a high-quality connection, with low latency and minimal jitter and packet loss. If the latency is consistently low enough, then the user can sit at a remote site anywhere in the world and feel a suitable response to operator commands.

High-speed connections. The Ohio Academic Resources Network (OARnet) and Internet2 (www.

Figure 3: Image of the operation station installed in the Air Force Research Laboratory at Wright-Patterson Air Force Base in Dayton, Ohio.

to the UK science minister during a visit to the Materials Department in London, but with only limited microscope functionality [10]. To our knowledge, the first implementation of a classroom environment with “live and interactive” microscope control was at Carnegie-Mellon University, where De Graef et al. designed and implemented a remote laboratory in which microscopes were “hardwired” to multiple stations so that groups of students could be trained on SEM techniques [11]. While effective locally, this approach was not scalable beyond the host institution. The topic has remained of interest to the community, and Sinclair and others have organized a series of international meetings to discuss it and identify best practice [12].

Time delay. At CEMAS the first step was to identify the technical challenges that needed to be overcome. It quickly became apparent that reliable, high-speed communication between CEMAS and the remote station was an absolute requirement. Many researchers have experienced an international call where there is a time delay in the connection. Such a delay, especially in the midst of a detailed technical conversation, can be very frustrating. Now imagine that same delay introduced into the control of a microscope. When the operator adjusts the focus or makes an alignment correction while looking at the output of the detector on the computer screen, he or she expects the response on the screen to be instantaneous. If the operator is sitting in a room 200 miles away and there is any delay in the response, operating the instrument quickly becomes inefficient, and execution of a complex experiment becomes almost impossible. In this scenario, most CEMAS users will eventually conclude, “Forget it, I’m going to jump in the car and drive to Columbus.”

Network latency. The network communication required for remote microscopy must have two characteristics.
First, it needs relatively high bandwidth because each image may contain many megabytes of data. But it needs more than just simple bandwidth. Networks make connections through switches, routers, and firewalls. The network path length and each device introduce delay. This delay is referred to as the network’s switching latency. Other factors such as congestion

) provided the solution to this challenge (Figure 2). CEMAS has a direct connection to OARnet, a 100-gigabit-per-second backbone that runs throughout the state of Ohio. OARnet was designed to move big data quickly and efficiently with minimal latency. It provides high-speed, low-latency connections to every major city in Ohio. Beyond Ohio, OARnet connects to Internet2, a nationwide high-speed network.

First remote access partners. Having identified an appropriate network solution, the next step in developing a remote microscopy capability was to prove the concept through a practical demonstration with a remote partner willing to work through the development process. The University of Dayton (about 90 miles from Columbus) and the Air Force Research Laboratory at Wright-Patterson Air Force Base, near Dayton, agreed to participate. Working with FEI Company (Thermo Fisher Scientific), the development team designed a remote-control station that replicates a set of microscope controls, without the microscope. A variety of video encoders, decoders, and other hardware and software maximize the speed of the signal and handle all the different detectors. After a thorough process of debugging and optimization, the first remote-control station was installed at the Air Force Research Laboratory (Figure 3), and a few weeks later the second went to the University of Dayton [1]. Both remote-control stations are currently in use. Users send their specimens by courier to the microscopy facility in Columbus. When the sample is in the microscope, the user receives an indication over the network that the system is ready to go, and can then use the microscope in the same manner as if they were sitting in front of it.
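The two requirements described above, high bandwidth for multi-megabyte images and low latency for responsive control, can be combined into a simple back-of-the-envelope check. The sketch below is illustrative only: the frame size, link speed, and round-trip time are assumptions for the sake of example, not measured figures from CEMAS, OARnet, or Internet2.

```python
# Hypothetical estimate of operator-perceived response time for remote
# microscopy: one network round trip for the command, plus the transfer
# time of a single detector frame back to the remote station.
# All numbers below are illustrative assumptions, not CEMAS measurements.

def frame_response_ms(frame_bytes: int, bandwidth_bps: float, rtt_ms: float) -> float:
    """Time from an operator command to an updated image on screen (ms)."""
    transfer_ms = frame_bytes * 8 / bandwidth_bps * 1000.0  # bytes -> bits -> seconds -> ms
    return rtt_ms + transfer_ms

# Example: a 4 MB detector frame over a 1 Gb/s path with a 10 ms round trip.
latency = frame_response_ms(4 * 1024 * 1024, 1e9, 10.0)
print(f"{latency:.1f} ms per frame")  # prints "43.6 ms per frame"
```

A commonly cited rule of thumb is that interaction feels instantaneous below roughly 100 ms of end-to-end delay, which is why a dedicated low-latency backbone matters more than raw bandwidth alone: jitter or congestion that pushes the round trip up by even tens of milliseconds is immediately felt at the controls.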

Expanding the program. Having successfully proven the concept of functional remote microscopy, researchers at CEMAS are now discussing expansion of the project with federal funding agencies. The National Science Foundation (NSF) is very familiar with the challenges facing programs such as the Major Research Instrumentation scheme and is keen to see how the capabilities at CEMAS can be developed and extended. Ultimately, this approach could help NSF deliver higher research impact with a limited capital budget. For example, with NSF support a remote system was successfully installed at North Carolina Agricultural and Technical University (N.C. A&T) in June 2018. This system, the first to be installed outside of the OARnet network, is able to control the instruments at CEMAS with the same high-quality operational standards as the directly OARnet-connected systems at the University of Dayton and the Air Force Research Laboratory.

Ohio State University and N.C. A&T, an HBCU institution with an outstanding track record of addressing diversity in

