and consumed by the organisation, not to implement systems per se,’ Dorsett stresses. ‘It is not a ‘buy and deploy it’ project: no vendor sells what is required to operate as a data-driven organisation.’ Accessibility of data is, Dorsett continues, ‘everything!’ Achieving a state of ‘analytics-ready’ is about R&D data flow throughout every aspect of your operations. ‘Siloed technologies and manual processes impede the flow and accessibility of data. Achieving seamless R&D data flow also has cultural requirements (data governance) and technical requirements (semantic tooling).’ Exponential growth in R&D data has, in addition, generated immense interest in whether and how R&D data may be used more effectively. ‘Efforts to become data-centric to enable ‘running with algorithms’ and the use of artificial intelligence (AI) and machine learning (ML) for predictive purposes have become commonplace.’

Iterative progress
Digital transformation isn’t a linear journey but a continuously iterative process as people, processes and technology change. ‘Just moving our systems out of paper is not enough to call ourselves ‘digitised’.’ At the same time, the biggest trap on the journey is letting ‘perfect’ be the enemy of ‘good enough’, Dorsett maintains. ‘Small iterative changes can be more effective in moving the organisation forward along the path to digitalisation than large, burdensome implementation projects. The seamless flow of high-quality data from producers to consumers across the organisation is the real point of a holistic digital transformation strategy that will drive innovation and power your lab of the future.’

For many companies and organisations, the concept of digital transformation has progressed well beyond the practicalities of reducing paper and manual data entry. The goal now is to achieve true, seamless lab connectivity and data harmonisation, to maximise data context, utilisation and longevity. Achieving this goal of potentially enterprise-wide digitalisation will hinge on realising plug-and-play integration of all lab informatics systems, instruments, devices, people and organisations, suggests Geoff Gerhardt PhD, chief technology officer at Scitara. Established two years ago, Scitara specialises in laboratory-specific, cloud-based software solutions and tools that facilitate digital transformation by enabling connectivity for the life sciences and other industries. ‘The Scitara DLX platform enables plug-and-play connectivity between any device, instrument, application, informatics system, web service or lab resource, on a vendor-neutral basis,’ Gerhardt noted. ‘On top of this interconnectivity, the DLX platform facilitates multidirectional data exchange and offers the tools and libraries that allow the transformation of data into required formats, in flight.’

Cohesive integration
Scitara’s aim is thus to move away from the concept of creating bespoke integrations between individual pieces of hardware and software. ‘Our approach is based on the formation of a cohesive data exchange, so you don’t have to reinvent the wheel every time you want a new integration for your LIMS system, data lake or ELN.’ Another benefit of this cohesive strategy is that the lab becomes

“The goal is to address the data created and consumed by the organisation, not to implement systems per se”

more easily ‘expandable’ and can diversify, Gerhardt suggested, as new instruments or informatics systems can be added and integrated using a relevant connector. Importantly, the Scitara platform is founded on an open framework, so third parties can also develop and add new plug-and-play connectors for instruments and software.

Digital transformation should enable better use of data, without losing or skewing context, he suggests. And this means more than just implementing standards for data or communication between lab systems. ‘Scitara is developing the mechanisms that will make it relatively straightforward to harness the full value of data, and that means finding, analysing, extracting and, if necessary, transforming data into a format required by the platforms to which it is transferred. Complete connectivity and retention of context will mean labs can push contextualised insight into, and retrieve it from, data lakes, which can then be better mined and interrogated.’

The laboratory has lagged other industries in its drive to generate this ecosystem of connectivity, Gerhardt suggests. ‘We enjoy all manner of integration in the consumer world. PayPal integrates seamlessly with my bank; in our homes, digital light switches, thermostats and door locks from different vendors can be made to work together.’ So why has the laboratory dragged its heels behind other industrial sectors? Historically, lab digitalisation efforts tended to be vendor-specific, and developing point-to-point integrations for individual instruments was perhaps the norm, Gerhardt says. ‘This may have been partly due to regulatory pressures. For regulated industries there may not be a huge incentive to modernise because, when changing your architecture, you must then revalidate your instruments and software for regulatory compliance.’

Scitara is working with software and hardware manufacturers to establish a framework for that pan-laboratory connectivity between the instruments, key platforms such as electronic laboratory notebooks, web services and analytical tools, including AI and ML. ‘We work with the vendors and write the connectors
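The connector model described here — a shared interface that any instrument or informatics system can implement, with data transformed ‘in flight’ as it moves between them — can be sketched in a few lines of code. The sketch below is purely illustrative: the names (Connector, BalanceConnector, ELNConnector, exchange) are invented for this example and do not reflect Scitara’s actual DLX API.

```python
from abc import ABC, abstractmethod
from typing import Any, Callable, Dict


class Connector(ABC):
    """Hypothetical vendor-neutral connector wrapping one lab resource."""

    @abstractmethod
    def read(self) -> Dict[str, Any]: ...

    @abstractmethod
    def write(self, record: Dict[str, Any]) -> None: ...


class BalanceConnector(Connector):
    """Toy connector for an analytical balance (read-only)."""

    def read(self) -> Dict[str, Any]:
        return {"weight_g": 1.2345, "instrument": "balance-01"}

    def write(self, record: Dict[str, Any]) -> None:
        raise NotImplementedError("balance is read-only")


class ELNConnector(Connector):
    """Toy connector for an electronic lab notebook."""

    def __init__(self) -> None:
        self.entries: list = []

    def read(self) -> Dict[str, Any]:
        return {"entries": list(self.entries)}

    def write(self, record: Dict[str, Any]) -> None:
        self.entries.append(record)


def exchange(source: Connector, target: Connector,
             transform: Callable[[Dict], Dict] = lambda r: r) -> None:
    """Move one record from source to target, transforming it in flight."""
    target.write(transform(source.read()))


# Usage: route a balance reading into the ELN, converting units in flight.
balance, eln = BalanceConnector(), ELNConnector()
exchange(balance, eln,
         transform=lambda r: {**r, "weight_mg": r["weight_g"] * 1000})
```

Because every resource sits behind the same interface, adding a new instrument or informatics system means writing one connector, not a bespoke point-to-point integration for each existing pairing — which is the essence of the plug-and-play approach the article describes.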

Summer 21 Scientific Computing World 27


