LABORATORY INFORMATICS


…to meet the needs of other users, without the need to create separate versions.'

ELN and, to a certain extent, LIMS systems have historically been bottlenecks to contextual data transfer, Brown noted. 'A lot of these platforms were effectively places where data went to die. They were fantastic at getting data, collecting it and storing it, but then scientists had problems getting it back out,' Brown said. And whereas in the past the real value of the ELN was in recording the experiment, an ELN today is also a valuable resource that is used to inform on prior experiments and help to direct forward studies and decision making. 'Data in the ELN will let scientists see which experiments have been carried out by someone else in the organisation, and what the results were, or to identify optimal conditions from similar experiments carried out historically, to help in the design of new experiments going forwards.'

However, as the complexity and throughput of data has exceeded what a scientist can handle manually, a pressing problem that has now emerged is how to get all the data and metadata into these platforms without loss of depth or context. To solve this issue, Dotmatics acquired Boston-based company BioBright last year. 'We needed to help customers solve just that "data-in" problem, and through the BioBright acquisition we now have both the "in" and "out" pieces of the data automation cycle,' Brown noted.

The BioBright platform has effectively given Dotmatics what it says is a unique combination of lab data capture, data processing, ELN and data analytics capabilities. Brown said: 'Using the Dotmatics platform, all data coming from instruments, or from external partners and contract research organisations, is channelled into the centralised informatics platform. This means scientists can more easily affect end-to-end workflows and access data seamlessly.'
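As a concrete illustration of that 'data-in'/'data-out' cycle, the short Python sketch below captures an instrument reading together with its metadata and then queries it back out of a single shared store. It is a minimal sketch under assumed names (InstrumentReading, CentralRepository and the example fields are all hypothetical), not Dotmatics' or BioBright's actual API.

# Minimal sketch of the 'data-in'/'data-out' pattern described above; the names
# (InstrumentReading, CentralRepository) are hypothetical, not a real vendor API.
from dataclasses import dataclass, field
from datetime import datetime, timezone
from typing import Any, Dict, List


@dataclass
class InstrumentReading:
    instrument_id: str                  # which instrument produced the data
    payload: Dict[str, Any]             # raw result values
    metadata: Dict[str, Any] = field(default_factory=dict)  # run conditions, operator, protocol
    captured_at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))


class CentralRepository:
    """Single store shared by the capture ('in') and analytics ('out') sides."""

    def __init__(self) -> None:
        self._records: List[InstrumentReading] = []

    def ingest(self, reading: InstrumentReading) -> None:
        # In a real platform this would be a database write with validation.
        self._records.append(reading)

    def query(self, **criteria: Any) -> List[InstrumentReading]:
        # Crude metadata filter standing in for the 'data-out' side.
        return [r for r in self._records
                if all(r.metadata.get(k) == v for k, v in criteria.items())]


# Example: a plate-reader result arrives with its context attached,
# and can later be pulled back out by that context.
repo = CentralRepository()
repo.ingest(InstrumentReading(
    instrument_id="plate-reader-01",
    payload={"well_A1": 0.42, "well_A2": 0.39},
    metadata={"assay": "ELISA", "operator": "jsmith", "protocol": "P-113"},
))
elisa_runs = repo.query(assay="ELISA")

The point of keeping payload and metadata side by side is that nothing has to be re-attached downstream: whatever analytics sit on the 'out' side see the same context the instrument supplied on the way in.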


A holistic approach

Reed Molbak, product manager at Benchling, said the ability to integrate and interface the lab holistically will be matched by the need to keep up with massively increased laboratory throughput related to 'omics' technologies. 'The ability to maximise intelligence from this increased throughput in physical experimentation and data generation will hinge on the ability to channel complete, contextual data and metadata through machine learning and AI,' Molbak noted. 'Such algorithms are increasingly being developed to interrogate and analyse the results from genomic and proteomic analyses.'

Increased throughput has not only impacted the data coming out of assays but necessitated the development of a new generation of laboratory automation to enable those high-throughput assay and screening workflows, Molbak also noted. 'Functions such as assay plate preparation, which would traditionally have been carried out by a scientist or technician, are now in the hands of liquid handling and other robotic systems that can keep up with this assay throughput.'

The modern lab environment

In today's lab environment, 'success' may now depend on being able to combine the ability to automate, and so increase physical throughput, with the ability to optimise data management, integrity and utilisation for intelligence. And so we come back round to AI and ML, Molbak noted. 'Organisations may be developing their own algorithms and AI techniques, and so they look for a LIMS/ELN system that will work in that environment.'

The ability to connect hardware and software in the cloud is taking the pressure off companies having to install and integrate multiple pieces of software at the desktop and in-house server level, Molbak noted. 'And for even relatively small companies, there will almost inevitably be multiple software platforms running, from different vendors, and for different instrumentation, that will need to interconnect.'

“Solutions can work alongside scientific workflows focused on antibody discovery, peptides, nucleic acids, antibody-drug conjugates or chemically-modified biologics”

The Benchling R&D Cloud has been designed to facilitate laboratory interconnection and communication from discovery through to bioprocessing. 'We have invested a lot of effort in enabling that imperative for instrument integration and interfacing, and also integrating the Benchling solution with other databases and data warehouses. Our goal is to enable our clients, who may have unique or proprietary techniques supporting their workflows, to be able to carry out specialised analyses, and support the management of data for ML and AI analysis and interrogation.'


Enabling laboratory unification

The differentiator for Benchling, Molbak believes, is enabling laboratory unification, and providing a holistic landscape for experimentation and data management. 'In my experience those co-ordination problems are some of the hardest when you're trying to knit together, say, an ELN and a LIMS system, and then possibly additional systems that interface with the lab instrumentation. Everything within the Benchling cloud environment is built on one database and uses one set of data models. For example, the plasmids that one team might design are held in the same records as the results, and all notebook data and all inventory data. It is all automatically centralised, no matter which part of our product you use, including our LIMS and ELN tools. There is never any need to try and interface different systems. What this means is that it then becomes much easier to run contextual data through ML, because nothing gets left behind.'
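To make the 'one database, one set of data models' idea concrete, here is a small illustrative sketch in the same spirit: hypothetical record types (Plasmid, NotebookEntry, Result) share one store and reference each other directly, so assembling ML-ready rows with full experimental context is an in-store join rather than a cross-system integration. It is not Benchling's real schema or API.

# Illustrative sketch only: hypothetical record types, not a real product schema.
from dataclasses import dataclass
from typing import Any, Dict, Iterator, List


@dataclass
class Plasmid:
    id: str
    sequence: str


@dataclass
class NotebookEntry:
    id: str
    plasmid_id: str             # direct reference; no cross-system lookup needed
    conditions: Dict[str, Any]  # e.g. temperature, media, cell line


@dataclass
class Result:
    entry_id: str
    titre_mg_per_l: float


class UnifiedStore:
    """All record types live in one store with one id space."""

    def __init__(self) -> None:
        self.plasmids: Dict[str, Plasmid] = {}
        self.entries: Dict[str, NotebookEntry] = {}
        self.results: List[Result] = []

    def training_rows(self) -> Iterator[Dict[str, Any]]:
        # Everything is co-located, so assembling ML-ready rows with full
        # experimental context is an in-store join, not an integration project.
        for result in self.results:
            entry = self.entries[result.entry_id]
            plasmid = self.plasmids[entry.plasmid_id]
            yield {"sequence": plasmid.sequence,
                   **entry.conditions,
                   "titre_mg_per_l": result.titre_mg_per_l}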


Benchling is investing time and development in its lab automation and developer platform. 'We can't necessarily predict where the science is going to go, but we are definitely keeping an eye on it, to make sure the Benchling platform will continue to integrate physical hardware and software,' Molbak said. 'We are also focused on ensuring we can match the increasing scale of data generation. I don't think anyone could have predicted, even two to three years ago, just how fast-paced the growth in experimental throughput would be.'

