LABORATORY INFORMATICS GUIDE 2022


experimental methods equivalent, for example, or does a sample ID from your LIMS match a sample ID from a CRO?’ Dorsett said.

It’s important to understand what tools are used at the level of the lab, facility and enterprise, as the basis for working out how to maximise use of that collective investment, identify key gaps and define a longer-term roadmap that recognises the importance of sustainability and total cost of ownership. ‘You want to find ways of using integration technologies that are already there more effectively, as well as to bring in new technologies,’ Dorsett said.

For instrument integration, there are middleware companies positioned to offer specific software that facilitates it, Dorsett suggested, citing SmartLine Data Cockpit, TetraScience and BioBright, the latter having been acquired by Dotmatics in 2019. ‘These companies are focused on providing tools that can address how people gather all their data from the different instrumentation,’ said Dorsett.


Consolidating data

Any rounded conversation on LIMS/ELN and software integration and management will likely, at some point, come around to the concept of data lakes, noted Robert D Brown, vice president, product marketing at Dotmatics. ‘The initial concept was that data lakes could house all the lab-derived data, and people could then dip in and retrieve what they needed, when they needed to.’

But in a real-world setting there are two types of data, he explained. ‘You have the structured data, such as data in your ELN, but then you will also have the vast number of unstructured data files being generated by all of this automated instrumentation labs now use. Typically, these files may be output in proprietary, non-standardised formats, and the data they contain will first need to be parsed out before being put into a LIMS or an ELN.’

But with the right tools in place, we can now have the best of both worlds. ‘We can have hybrid systems where the unstructured data lives in the lake, and the structured part of that data can then go into the ELN. As long as both sides have a good API, and you have a way of parsing the data, then it’s possible to overcome most technical hurdles. The trick is to link the two types of data appropriately.’

From a software perspective, the ability to work with both biologics and small molecules using the same overall platform is founded on the use of software components that can be slotted together in multiple ways to establish the right workflow for the right outcome. A chemist may do things in one order, but a molecular biologist might do them in another. ‘That’s the real trick,’ he continued, ‘to be able to put the different pieces of the same overall solution together so they match the workflow for the different scientists.’

And the next stage in software evolution will at least in part – and perhaps inevitably, Brown noted – focus on integrating AI into the everyday lab function. ‘First, you have to add AI and machine learning into your software stream,’ and this is more procedural, he indicated. ‘But critically – and this is perhaps the biggest problem – it’s imperative that you are getting absolutely clean data into those ML models in the first place. If you don’t put clean data in, you will get garbage out.’
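The hybrid pattern Brown describes, with raw instrument files kept in the lake and the structured portion parsed out and pushed to the ELN through its API, can be sketched in a few lines. Everything below is a hypothetical illustration: the export format, the field names and the `parse_plate_reader_csv` helper are assumptions, not any vendor's actual interface.

```python
import csv
import io
import json

def parse_plate_reader_csv(raw: str) -> list[dict]:
    """Parse a (hypothetical) plate-reader export: a block of
    'key: value' header lines, then a CSV table of well readings."""
    header, _, table = raw.partition("\n\n")
    meta = dict(line.split(": ", 1) for line in header.splitlines())
    rows = csv.DictReader(io.StringIO(table))
    # Attach instrument metadata to every structured record,
    # and coerce the measurement to a number for the ELN.
    return [{**meta, **row, "od600": float(row["od600"])} for row in rows]

def to_eln_payload(records: list[dict]) -> str:
    """Shape the structured records as a JSON body an ELN API might accept;
    the raw file itself would stay in the lake untouched."""
    return json.dumps({"entries": records}, indent=2)

raw_export = (
    "instrument: PR-7\n"
    "run_id: 2022-01-14-003\n"
    "\n"
    "well,od600\n"
    "A1,0.412\n"
    "A2,0.388\n"
)

records = parse_plate_reader_csv(raw_export)
payload = to_eln_payload(records)
```

The point of the sketch is the split Brown describes: `raw_export` goes to the lake verbatim, while only the parsed `records` are posted to the ELN, with the run ID linking the two sides.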


This brings us back to the concept of data automation, so you don’t have to use humans to move data around, which will at some stage run the risk of human errors in data manipulation, Brown said. Automating data generation, management and transfer will also facilitate that ability to pull complete datasets across from any system into the ML learning pool. ‘And here is where we have the advantage of the BioBright lab automation solution, which automates the complete process of getting data off instruments into the lake, parsing it, and putting it into the notebook. Compare this with the requirement for human transfer of files between systems, and the manual inputting of data, and sharing spreadsheets, which is inherently error-prone.’

‘As well as solving the integration challenge for individual scientists, we want to make sure it can also be achieved at scale’

With this goal of complete round-trip automation in mind, Dotmatics announced a partnership in January with HighRes Biosolutions, which designs and builds robotic systems and laboratory devices. The collaboration focuses on marrying the high-throughput laboratory automation capability of the Dotmatics ELN with the HighRes instrument control software, Cellario. This combination frees scientists to plan experiments, run individual instruments, and publish and analyse data in a single software interface.
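The round trip described above, data moving off instruments and into the lake without manual copying, reduces in skeleton form to a sweep over an instrument drop folder with a quality gate in front of ingestion. The folder layout, the `validate` rule and the `sweep` helper are illustrative assumptions; a real system would be event-driven and format-aware rather than this simple.

```python
from pathlib import Path
import shutil

def validate(text: str) -> bool:
    """Reject obviously malformed exports before they reach the lake:
    here, every non-blank line must have exactly two comma-separated fields."""
    lines = [line for line in text.splitlines() if line.strip()]
    return bool(lines) and all(len(line.split(",")) == 2 for line in lines)

def sweep(inbox: Path, lake: Path, quarantine: Path) -> dict:
    """One pass over the drop folder: files that pass validation move to
    the lake verbatim, the rest are quarantined for human review."""
    moved = {"lake": 0, "quarantine": 0}
    for f in sorted(inbox.glob("*.csv")):
        dest = lake if validate(f.read_text()) else quarantine
        shutil.move(str(f), str(dest / f.name))
        moved["lake" if dest is lake else "quarantine"] += 1
    return moved
```

A quarantine step of this kind is one concrete way to act on Brown's garbage-in, garbage-out warning: nothing unvalidated ever reaches the datasets that feed the ML pool.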


www.scientific-computing.com



