LABORATORY INFORMATICS GUIDE 2022


journey, but rather something that will be continuously iterative as people, processes and technology change. ‘Just moving our systems out of paper is not enough to call ourselves “digitised”. At the same time, the biggest trap for the future is letting “good enough” be the enemy of perfect during the journey,’ Dorsett maintained. ‘Small iterative changes can be more effective in moving the organisation forward than large, burdensome implementation projects. The seamless flow of high-quality data, from producers to consumers across the organisation, is the real point of a holistic, digital transformation strategy.’ For many companies and


organisations, the concept of digital transformation has progressed well beyond the practicalities of reducing paper and manual data entry. The goal now is to achieve truly seamless lab connectivity and data harmonisation, to maximise data context, utilisation and longevity. But achieving this goal of potentially enterprise-wide digitalisation will hinge on realising plug-and-play integration of all lab informatics systems, instruments, devices, people and organisations, suggested Geoff Gerhardt, PhD, chief technology officer at Scitara. Established two years ago, Scitara


specialises in the development of laboratory-specific, cloud-based software solutions and tools that facilitate digital transformation by enabling connectivity for life science and other industries. ‘The Scitara DLX platform enables plug-and-play connectivity between any device, instrument, application, informatics system, web service or lab resource, on a vendor-neutral basis,’ Gerhardt noted. ‘On top of this interconnectivity, the DLX platform facilitates multidirectional data exchange and offers the tools and libraries that allow the transformation of data into required formats, in-flight.’


Cohesive integration

Scitara's aim is thus to move away from creating bespoke integrations between individual pieces of hardware and software. 'Our approach is based on the formation of a cohesive data exchange, so you don't have to reinvent the wheel every time you want a new integration for your LIMS, data lake or ELN.' Another benefit of this cohesive strategy is that the lab becomes more easily




‘expandable,’ and can diversify, Gerhardt suggested, as new instruments or informatics systems can be added and integrated using a relevant connector. And importantly, the Scitara platform is founded on an open framework, so third parties can also develop and add new plug-and-play connectors for instruments and software. Digital transformation should enable


better use of data, without losing or skewing context, he suggested. This means more than just implementing standards for data or communication between lab systems. 'Scitara is developing the mechanisms that will make it relatively straightforward to harness the full value of data, and that means finding, analysing, extracting and, if necessary, transforming data into a format required by the platforms to which it is transferred. Complete connectivity and retention of context will mean labs can push and retrieve contextualised insight into, and out of, data lakes, which can then be better mined and interrogated.'

The laboratory has lagged behind other industries in its drive to generate this ecosystem of connectivity, Gerhardt suggested. 'We enjoy all manner of integration in the consumer world; PayPal integrates seamlessly with my bank. In our homes, digital light switches, thermostats and door locks from different vendors can work together.' So why has the laboratory dragged its heels behind other industrial sectors? Historically, lab digitalisation efforts tended to be vendor-specific, and developing point-to-point integrations for individual instruments was perhaps the norm, Gerhardt said. 'This may have been partly due to regulatory pressures. For regulated industries, there may not be a huge incentive to modernise because, when changing your architecture, you must then revalidate your instruments and software for regulatory compliance.' Scitara is working with software and


hardware manufacturers to establish a framework for that pan-laboratory connectivity between the instruments, key platforms such as electronic laboratory notebooks, web services and analytical tools, including AI and ML. ‘We work with them so their products can participate in the exchange. This opens the way to lab interconnectivity, communication and the exchange of data and instructions.’ In addition to the development of


"The goal is to address the data created and consumed by the organisation, not to implement systems per se"




its integration platform, Scitara has generated an orchestration layer that allows users, 'in a user-friendly, drag-and-drop interface type of way', to create automated, event-driven lab workflows, Gerhardt explained. So, when an ELN, for example, requests a balance reading, this triggers a cascade of events as an automation. 'The user will be notified to take the balance reading which, once taken, will be published as a new event, and the resulting data and metadata then move back into the ELN,' he explained. 'And then it becomes feasible to execute this workflow seamlessly.'
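The balance-reading cascade Gerhardt describes can be sketched with a minimal in-memory publish/subscribe bus. The event names and payload fields below are invented for illustration; a real orchestration layer would add user notification, persistence and error handling.

```python
# Minimal event-driven workflow sketch: an ELN request triggers a balance
# reading, which is published as a new event and lands back in the ELN.
from collections import defaultdict

subscribers = defaultdict(list)
eln_records = []  # stands in for the ELN's data store


def subscribe(event, handler):
    subscribers[event].append(handler)


def publish(event, payload):
    # Delivering one event may cause handlers to publish further events,
    # forming the cascade described in the text.
    for handler in subscribers[event]:
        handler(payload)


# When a reading is requested, take it (stubbed here) and publish the result.
subscribe("balance.requested", lambda p: publish(
    "balance.read", {"sample": p["sample"], "mass_mg": 102.4}))

# The resulting data and metadata flow back into the ELN.
subscribe("balance.read", lambda p: eln_records.append(p))

# The ELN requests a balance reading, triggering the automation.
publish("balance.requested", {"sample": "S-001"})
print(eln_records)  # [{'sample': 'S-001', 'mass_mg': 102.4}]
```

The design point is that neither party calls the other directly: the ELN and the balance only agree on event names and payloads, which is what makes the workflow composable.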


Transforming data

Keeping data in an inherently adaptable format, such as JSON, also makes it possible to transform data, agnostic to the original format, Gerhardt noted. 'Data can be transformed, in flight, to the format required by its destination, whether that be an ELN, or an artificial intelligence and machine-learning tool. Rather than imposing standardisation on everything, our approach allows this flexible data transformation, which is enabled by taking the data out of its native format, putting it into a friendly format, such as JSON, and then transforming that data into the shape that your destination application, such as an AI tool, can ingest.'

One of the major problems with applying data-driven R&D is that you cannot go straight to unsupervised learning models, commented Max Petersen, AVP of chemicals and materials marketing at Dotmatics. 'You need to have some kind of supervision to train your algorithms to explain and demonstrate the data that contributes to a positive outcome, and that relies on complete, clean data.' The ability to derive end-to-end, clean and insightful data for that learning model by interrogating a complete ecosystem
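A hedged sketch of that in-flight transformation idea: parse a mock vendor-specific output into a neutral, JSON-friendly structure, then reshape it for a hypothetical destination. The source format and field names are assumptions for the example, not any real instrument's output.

```python
# In-flight transformation sketch: native format -> neutral JSON -> destination shape.
import json

# Mock vendor-specific key=value output from an instrument.
raw = "sample=S-001;mass=102.4;unit=mg"

# Step 1: lift the native format into a neutral, JSON-friendly dict.
neutral = dict(pair.split("=") for pair in raw.split(";"))

# Step 2: reshape for the destination, e.g. the feature schema an ML tool ingests.
features = {
    "sample_id": neutral["sample"],
    "mass_mg": float(neutral["mass"]),
}

print(json.dumps(features))  # {"sample_id": "S-001", "mass_mg": 102.4}
```

The intermediate `neutral` dict is the point: once data is in an adaptable representation, any number of destination-specific reshapings can be applied without revisiting the vendor format.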







