LABORATORY INFORMATICS


Managing a new era in laboratory development

Robert Roe reports on the Paperless Lab Academy event held in Milan on the bank of Lake Maggiore


The Paperless Lab Academy conference celebrated its sixth year in 2018. Held in a hotel on the bank of Lake Maggiore, it was the first time the event had taken place in Italy, after three years in Barcelona. The event kicked off with the usual


introduction from organisers Isabel Munoz and Roberto Castelnovo, who welcomed the attendees and highlighted some key facts, including a 10 per cent increase in the number of visitors. The first keynote was delivered by Pat


Pijanowski, managing director at Accenture Scientific Informatics Services (ASIS), and Dr Matt Ellis, senior manager, ASIS Europe, on the potential for blockchain technology to be used in the laboratory. Pijanowski opened with quotes from


various sources highlighting the hype behind this technology in recent years, quoting The Economist, which said the technology was ‘a catalyst for global prosperity’, and Gartner, which reported that ‘10 per cent of global GDP will be on blockchain in 10 years’. While these claims refer to the use of


blockchain as a technology underpinning the development of cryptocurrencies, the technology has other potential use cases that can help to increase the security and traceability of data. In the same way that blockchain can produce a secure and distributed record of transactions, the technology can also be applied to laboratory systems. Blockchain allows for the creation of a


chain consisting of blocks of transactions; the order of transactions is represented by their position in the chain. Transactions cannot be modified, and each participant, or node, keeps an identical copy of all transactions, so new data can only be added by consensus between nodes.
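The structure described above — blocks linked in order, where altering any earlier transaction invalidates everything after it — can be sketched in a few lines of Python. This is a simplified illustration of the hash-chaining idea only, not any specific blockchain platform; the sample transactions are invented for the example:

```python
import hashlib
import json

def block_hash(block):
    # Hash the block's full contents, including the previous block's hash,
    # so a change to any earlier block breaks every later link.
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

def add_block(chain, transactions):
    # Each new block records the hash of the block before it.
    prev = block_hash(chain[-1]) if chain else "0" * 64
    chain.append({"index": len(chain), "prev_hash": prev,
                  "transactions": transactions})
    return chain

def verify(chain):
    # Recompute each link; a tampered block no longer matches
    # the prev_hash stored in its successor.
    return all(chain[i]["prev_hash"] == block_hash(chain[i - 1])
               for i in range(1, len(chain)))

chain = []
add_block(chain, [{"sample": "A-001", "result": 4.2}])
add_block(chain, [{"sample": "A-002", "result": 3.9}])
assert verify(chain)

# Editing a recorded result after the fact is detectable:
chain[0]["transactions"][0]["result"] = 9.9
assert not verify(chain)
```

In a real deployment each node would hold its own copy of the chain and run this kind of verification independently, which is what makes consensus between nodes necessary before new data is accepted.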


What does this mean for the laboratory? In simple terms, blockchain can be used to create a distributed database of data transactions that is secure and cannot be


28 Scientific Computing World April/May 2018


tampered with or edited after creation. This provides regulatory authorities, such as the FDA, with a full, chronologically ordered record of all experiments and data recorded in a laboratory for a given project. Pijanowski did note that this was not


necessary for all scenarios, but highlighted two potential use cases where the technology could be applied: externalisation and increased data integrity. In the case of data integrity, Pijanowski


stated that ‘replication of data across all nodes provides a reliable source of truth’. While a network of nodes can self-verify all transactions, an organisation could also introduce ‘smart contracts that could be used to apply business logic and process transactions in near to real-time’.
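The smart-contract idea mentioned here — business logic that every node applies before a transaction is accepted — can be illustrated with a minimal sketch. The rule and field names below are hypothetical examples invented for illustration, not from any particular laboratory system or contract platform:

```python
# A "smart contract" here is simply a rule that every node runs
# before admitting a transaction into the shared record.
def assay_contract(tx):
    # Hypothetical business logic: a result must name an operator
    # and fall within the method's validated range (0-100).
    return "operator" in tx and 0.0 <= tx.get("result", -1.0) <= 100.0

def submit(ledger, tx, contract):
    # Only transactions satisfying the contract are appended.
    if contract(tx):
        ledger.append(tx)
        return True
    return False

ledger = []
submit(ledger, {"operator": "jd", "result": 42.0}, assay_contract)  # accepted
submit(ledger, {"result": 150.0}, assay_contract)                   # rejected
```

Because the same contract runs on every node, the business logic is enforced network-wide without a central gatekeeper, which is how near real-time processing of transactions becomes possible.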


The cost of data

The following presentation, from Eric Little, chief data officer at Osthus, helped to clarify the importance of recording and storing data correctly in order to maximise its use, and therefore its value, to an organisation. Little provided a strategy for reference master data management, ‘Meeting the challenge to support analytics in the e-data lifecycle’. Little noted that in a world of


increasingly costly and difficult research and development, pharma companies must reduce costs through better use of the data generated within an organisation. Little stated that companies should


‘use the data you have before you generate more.’ This can be accomplished by reducing the number of re-run experiments, and through system integration and standardisation. Little also noted that once data is


available, an organisation should ‘automate as much as possible’, starting with simple recurring tasks such as workflows, models and query patterns, and then expanding the process.


“Replication of data across all nodes provides a reliable source of truth”


‘We need to move from big data to big analysis’ stated Little. ‘It’s about making big data small data; it’s about making big data usable data.’


The importance of maintaining data integrity

If costs and productivity alone do not make the case for drastic new approaches to the way in which data is recorded and stored, then these changes will be enforced through regulation. One example of this can be found in the


way that the new GDPR regulation will affect patient and medical data, but looking to the future, Mark Newton, from Heartland QA, suggests that fraud and data integrity issues may result in significant changes in the way that research data is collected, processed and retained. Newton opened his presentation with


some facts regarding fraud in scientific research. He highlighted several examples and one article in 2017 reporting that the scale may be much larger than originally



