PROCESS AUTOMATION
AUTOMATING LABORATORY PROCESSES
Aliko Chanda, application scientist, H.E.L Group, shows how developments in laboratory automation will be driven by the ability to integrate multiple devices into a single workflow
Automation in a laboratory is crucial to unlocking cost-saving benefits, relieving staff to work on other projects, and avoiding unnecessary experiment repetition. In a lab, prior to automation, it was estimated that pre-analytical processes took more than 60% of an individual's time, with laboratory error rates ranging from 30% to 86%[1]. To redress these figures, labs have sought out and steadily adopted automation of their processes. But where does automation go in the future? While we will see the continued development of robotic and automated platforms offering greater precision, accuracy, or higher throughput, the real power in laboratory automation lies with the control software. Further developments in lab automation will be driven by the ability to integrate multiple devices into a single workflow, with increasingly sophisticated built-in tools and functionality, allied with a connection to information databasing and analysis systems.

Walk into any laboratory today and you will find a range of equipment from different vendors, often independently controlled by unique software platforms. In some labs, the hardware is integrated into a single platform, typically achieved by combining systems from multiple sources and writing an overarching control software system – either in-house or by calling on external IT expertise. Individual pieces of equipment running within the lab will increasingly be found connected to central data storage and analysis systems, ranging from electronic laboratory notebooks to AI-powered analysis platforms, adding the ability to predict failure and recommend corrective intervention, as well as to suggest overall improvements to experimental protocols.

34 NOVEMBER 2021 | PROCESS & CONTROL
Common software platform
When a laboratory uses equipment from multiple vendors, there is an inevitable overhead in staff training. Each scientist needs to be adequately trained to use each and every piece of equipment safely and effectively. In many cases, individual systems come with a unique interface design and may use labels or terminology within the software that are not consistent across systems. Indeed, some users may question the very concept of multiple instruments running from a single software platform, owing to complications such as individual system failures that occur during a workflow and are difficult to isolate and troubleshoot when they are part of a wider instrument network. Overcoming these barriers, however, it is possible to envisage a common software platform that allows all instruments to be controlled effectively from one interface, with a concomitant reduction in staff training and learning overheads, ultimately increasing the efficiency and accuracy of lab processes.

Rather than being reserved for the lab of the future, next-generation process automation software being released now is already introducing this unified approach to process control, enabling devices from different manufacturers to be controlled individually within a single environment that integrates each system's API connections and programming interfaces. These devices are grouped into functional units that can conduct a common task – for example, controlling both the internal and external temperatures of a PolyBLOCK system simultaneously, keeping the temperature inside a reactor at the desired value. This would involve concurrent heating and cooling of the reactor throughout an experiment to maintain a single set temperature.

Another example of this consolidated approach is multiple reactors within the same reaction block running under different experimental conditions, such as stirring speed, temperature, pressure, or pH control. These layers of device control are hidden from the user, who interacts with a single common interface and an experiment designer to set up and monitor experiments, while safe operating conditions are monitored continuously. This type of operating interface opens up the potential to pause and update workflows mid-experiment, improving laboratory efficiency when small (or large) changes to an experiment will increase the value of the data created. This approach can also provide the user with an enhanced set of calibration tools – leading to a simpler, more consistent device setup.
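The functional-unit idea described above can be sketched in a few lines of code. The sketch below is purely illustrative and makes no claim about the actual H.E.L software or PolyBLOCK API: the `Device`, `SimulatedReactorProbe`, `SimulatedJacket`, and `FunctionalUnit` names are hypothetical, and real vendor drivers would sit behind the `Device` interface. It shows the two behaviours the article highlights – grouping devices into a unit that holds a reactor at a setpoint, and pausing or updating that unit mid-experiment.

```python
from abc import ABC, abstractmethod

class Device(ABC):
    """Vendor-neutral device wrapper (illustrative, not a real API)."""
    @abstractmethod
    def read(self) -> float: ...
    @abstractmethod
    def set_output(self, value: float) -> None: ...

class SimulatedReactorProbe(Device):
    """Stand-in for an internal reactor temperature sensor."""
    def __init__(self, start_temp: float):
        self.temp = start_temp
    def read(self) -> float:
        return self.temp
    def set_output(self, value: float) -> None:
        # A probe has no actuator; nudge the simulated temperature instead.
        self.temp += value

class SimulatedJacket(Device):
    """Stand-in for the external heating/cooling jacket."""
    def __init__(self):
        self.power = 0.0
    def read(self) -> float:
        return self.power
    def set_output(self, value: float) -> None:
        self.power = value

class FunctionalUnit:
    """Groups devices to perform one task: hold an internal setpoint by
    driving the jacket, with pause and mid-experiment setpoint updates."""
    def __init__(self, probe: Device, jacket: Device,
                 setpoint: float, gain: float = 0.5):
        self.probe, self.jacket = probe, jacket
        self.setpoint, self.gain = setpoint, gain
        self.paused = False
    def update_setpoint(self, new_setpoint: float) -> None:
        self.setpoint = new_setpoint  # applied without restarting the run
    def step(self) -> float:
        if self.paused:
            return self.probe.read()  # workflow suspended, no actuation
        error = self.setpoint - self.probe.read()
        self.jacket.set_output(self.gain * error)  # proportional control
        self.probe.set_output(self.jacket.read())  # simulated heat transfer
        return self.probe.read()

probe, jacket = SimulatedReactorProbe(20.0), SimulatedJacket()
unit = FunctionalUnit(probe, jacket, setpoint=50.0)
for _ in range(30):
    unit.step()
print(round(probe.read(), 1))  # → 50.0 (converges to the setpoint)
```

The point of the sketch is the layering: the user-facing object is the functional unit, while the individual device calls – and whether they heat, cool, or stand in for a vendor driver – stay hidden beneath the common interface.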