LABORATORY INFORMATICS
[Image: Lake Maggiore, northern Italy. Credit: elesi/Shutterstock.com]
“We need to move from big data to big analysis. It’s about making big data small data; it’s about making big data usable data”
traditional laboratory operations. Gerhard Noelken, business development at the Pistoia Alliance, highlighted this in his talk, which focused on using technologies such as IoT, machine learning and blockchain to create the 'Lab of the Future' (LotF), one of Pistoia's key themes for its work in 2018. Noelken suggested that while technology may be adopted quickly by consumer markets, it takes significantly longer to reach the laboratory. To help accelerate this process, LotF was chosen as a strategic theme for 2018, providing pre-competitive support for more rapid implementation of value-adding components in today's laboratory environment.
Another point noted by Noelken was that the topics chosen by the Pistoia Alliance reflected problems or challenges that Pistoia saw 'again and again.' There are several short- and long-term areas of focus for the LotF project: blockchain, IoT, augmented reality and the automation of intelligent systems are immediate goals, while the focus will later shift towards AI and, eventually, quantum computing and virtual research.

As an example of how this might drive thought, Newton informed the audience that from 2015 to 2017 the FDA issued 130 warning letters, each representing findings of multiple serious or critical infractions. Newton also highlighted a 2013 report published by the National Center for Biotechnology Information, which stated that 'nearly 40 per cent of researchers knew of fraud but did not report it.' During the talk, Newton noted that most loss of data integrity 'happens at the point of collection', but 'any place where humans interact is an integrity risk point'. As with the other speakers, Newton highlighted several ways to mitigate the risk of incorrect or fraudulently entered data, much of which revolves around automating the process and removing the potential for human error or fraud.
Here is another example of how blockchain could help to increase data integrity in the laboratory. Without systems integration, data or transactions created by manual data entry still leave a possibility for fraud, but fraud is much easier to detect and prove in a system that records a full, unedited record of all data transactions in sequence. The inherent transaction history and traceability provide a complete trail that facilitates regulatory oversight and increases an organisation's auditability. Pijanowski also noted that 'distributed consensus' can, in some cases, 'reduce the costs associated with manual data transcription and reconciliation during the data lifecycle.'

This is a technology that is already starting to make an appearance in the laboratory, particularly in highly regulated industries, and Pijanowski expects this growth to continue, citing a forecast that 'in 2018, approximately 35 per cent of life sciences companies will deploy blockchain into their organisations.'
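To make the tamper-evidence property concrete, the following is a minimal single-node sketch in Python. The LabLedger class and its record layout are illustrative assumptions rather than any specific blockchain product: each record's hash covers the hash of the previous record, so any retrospective edit breaks the chain and is caught on verification.

import hashlib
import json
import time

def _hash_record(record: dict) -> str:
    """Hash a record deterministically (sorted keys, stable separators)."""
    payload = json.dumps(record, sort_keys=True, separators=(",", ":"))
    return hashlib.sha256(payload.encode("utf-8")).hexdigest()

class LabLedger:
    """Illustrative append-only ledger: every entry is chained to the
    hash of the previous one, so edits to history are detectable."""

    def __init__(self):
        self._chain = []

    def append(self, instrument: str, measurement: dict) -> str:
        record = {
            "index": len(self._chain),
            "timestamp": time.time(),
            "instrument": instrument,
            "measurement": measurement,
            "prev_hash": self._chain[-1]["hash"] if self._chain else "0" * 64,
        }
        record["hash"] = _hash_record(record)
        self._chain.append(record)
        return record["hash"]

    def verify(self) -> bool:
        """Recompute every hash; any retrospective edit breaks the chain."""
        for i, record in enumerate(self._chain):
            body = {k: v for k, v in record.items() if k != "hash"}
            if record["hash"] != _hash_record(body):
                return False
            expected_prev = self._chain[i - 1]["hash"] if i else "0" * 64
            if record["prev_hash"] != expected_prev:
                return False
        return True

ledger = LabLedger()
ledger.append("hplc-01", {"analyte": "caffeine", "area": 1523.4})
ledger.append("hplc-01", {"analyte": "caffeine", "area": 1498.7})
assert ledger.verify()

# Tampering with a past measurement is now detectable:
ledger._chain[0]["measurement"]["area"] = 9999.9
assert not ledger.verify()

A production system would distribute the ledger across organisations and add the 'distributed consensus' Pijanowski describes, but even this single-node sketch shows why a full, sequenced record of transactions makes manual tampering easy to detect and prove.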
Preparing for the lab of the future

However, it is not just blockchain technology that threatens to disrupt new paradigms of research. The use of IoT and the creation of huge amounts of genetic data, coupled with AI and more powerful computing systems, will be fundamental pillars in the development of precision medicine. If these technologies are not adopted, this research will stall, so it is important for organisations such as the Pistoia Alliance to help drive adoption of new technology.

In the immediate future, the plan is to help drive IoT technology into the lab. 'We are evaluating the maturity and usability of IoT in an end-to-end use case, sharing actual data across the industry and analysing the data using artificial intelligence tools,' commented Noelken. 'The second project deals with the role of the analytical method documentation in the lab. Translating the human-readable protocol into an electronic instruction set in a standard way would hugely improve reproducibility and data quality for experiments, from discovery to manufacturing,' he added.
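Noelken's second project, turning human-readable protocols into standard electronic instruction sets, can be pictured with a short sketch. The step vocabulary and field names below are hypothetical, invented for illustration rather than taken from any Pistoia Alliance standard: the point is simply that once a method is structured data rather than free text, it can be validated before it runs and replayed identically on any compliant instrument.

# A hypothetical machine-readable analytical method: structured steps
# instead of free-text protocol prose. Field names are illustrative only.
method = {
    "id": "assay-caffeine-v2",
    "steps": [
        {"op": "dispense", "reagent": "mobile_phase_A", "volume_ul": 500},
        {"op": "mix", "speed_rpm": 300, "duration_s": 60},
        {"op": "inject", "volume_ul": 10},
        {"op": "measure", "detector": "uv", "wavelength_nm": 273},
    ],
}

# Minimal validation: every step must name a known operation and carry
# simple, typed parameters, so a malformed protocol fails before execution.
KNOWN_OPS = {"dispense", "mix", "inject", "measure"}

def validate(method: dict) -> None:
    for i, step in enumerate(method["steps"]):
        if step["op"] not in KNOWN_OPS:
            raise ValueError(f"step {i}: unknown operation {step['op']!r}")
        for key, value in step.items():
            if key != "op" and not isinstance(value, (int, float, str)):
                raise ValueError(f"step {i}: bad parameter {key!r}")

validate(method)  # raises on malformed input; silent on success

Because the same structured method can be executed unchanged in discovery and in manufacturing, an electronic form along these lines speaks directly to the reproducibility and data-quality gap Noelken describes.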