LABORATORY INFORMATICS

Standardising access to data
ACD/Labs helps lab users increase productivity through data standards and by characterising the interpretation of data, writes Sophia Ktori
ACD/Labs develops and commercialises software solutions for scientists working with small molecules in chemical, biochemical, and pharmaceutical R&D. The firm has specialist expertise in the development of software that allows organisations to digitally assemble analytical, structural and molecular information for what it describes as 'effective decision-making, problem-solving, and product life cycle control.'

Andrew Anderson, VP business development at ACD/Labs, says the way that analytical data can be accessed and utilised still represents a major challenge at every stage of a product life cycle, and this can affect how that data is used to inform decision-making. The ultimate aim of any digitisation initiative in the pharma or biotech arena is to develop better, safer and more effective drugs faster, with less attrition and lower overall costs. 'At the most fundamental level you can increase the velocity of your product life cycle, and you are doing this partly by reducing your individual unit operations within that process flow of making decisions, and facilitating regulatory approval by reducing the complexity and costs associated with preparing responses and accessing relevant information.'
Providing decision support

However, even with all the investment that has been made in areas such as data standardisation, significant bottlenecks still exist in the decision support arena, he maintains. 'Take the example of GMP drug manufacture for clinical trials. Each lot is rigorously tested, but if a purity issue arises and batches don't pass QC, there will be an investigation, part of which will be comparative, and the results of that investigation will inform next steps, and potentially help to prevent future issues.'

The failed batch will be analysed and that analytical data compared with a reference standard for compound purity, and with data from historical batches that do meet quality standards. 'What happens from a data perspective is that people responsible for undertaking that comparison will have to go on a data scavenger hunt.'

And this is no simple task, he stated. 'Relevant chromatography data for the failed batch, the reference standard and prior batches may be held in different chromatography data systems, either in-house or with contract manufacturing organisations, so accessing and directly comparing such data is not only time consuming, but can incur significant costs.'

The ultimate aim, Anderson points out, is thus not just to standardise the format of analytical data, but also to consider where it resides and, importantly, the scope of the data that will inform on the issue. 'While it may be possible to put the majority of that data in standardised formats so that current and historical data and metadata can be cross-referenced, we would also advocate the application of standards around characterising the interpretation of that data,' he said. 'Chromatographic data analysis will aim to understand the composition of impurities in your substance. In addition to the absolute peaks in the chromatogram, you will ideally include another layer of data relevant for interpretation of the analytical dataset, such as detected peaks above a certain noise threshold, or attributes of those peaks.'

Generate and manage data that is of sufficient fidelity and has an appropriate level of granularity, and it becomes possible to use machines to compare one dataset with every other dataset across the supply chain, so that you can identify trends and potentially pre-empt QC issues before they impact on productivity, Anderson suggested.
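To make that idea concrete, a comparison at the interpretation layer might look something like the sketch below, which checks a batch's interpreted peak table against a reference standard and flags new or growing impurity peaks. The Peak structure, retention-time tolerance and area threshold are invented for illustration and are not drawn from ACD/Labs' software.

from dataclasses import dataclass
from typing import List

@dataclass
class Peak:
    retention_time: float  # minutes
    area_percent: float    # peak area as a percentage of total area

def compare_peak_tables(batch: List[Peak], reference: List[Peak],
                        rt_tolerance: float = 0.1,
                        area_threshold: float = 0.05) -> List[str]:
    # Flag peaks in the batch that are absent from, or larger than in, the reference.
    findings = []
    for peak in batch:
        match = next((r for r in reference
                      if abs(r.retention_time - peak.retention_time) <= rt_tolerance), None)
        if match is None and peak.area_percent >= area_threshold:
            findings.append(f"new impurity peak at RT {peak.retention_time:.2f} min "
                            f"({peak.area_percent:.2f}% area)")
        elif match is not None and peak.area_percent > match.area_percent + area_threshold:
            findings.append(f"peak at RT {peak.retention_time:.2f} min grew from "
                            f"{match.area_percent:.2f}% to {peak.area_percent:.2f}% area")
    return findings

# Hypothetical interpreted peak tables for a failed lot and the reference standard.
reference_standard = [Peak(3.42, 98.7), Peak(5.10, 0.8)]
failed_lot = [Peak(3.41, 96.2), Peak(5.11, 2.9), Peak(7.80, 0.6)]

for finding in compare_peak_tables(failed_lot, reference_standard):
    print(finding)

In practice such a comparison would run across every batch in the supply chain rather than a single pair, which is why both the data and its interpretation need to be held in a consistent, machine-readable form.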
'This is an area where we've been working with customers to develop a decision support application that facilitates increased productivity by giving users access to live data across data sources. We've gone through the hard work of connecting the different data-generating systems to the decision support systems the project teams use, and embodied this in the decision support platform Luminata.'

Luminata consolidates all critical product development information in one location, standardising not so much the data format as its accessibility, depth of utility and usability, in real time.
What this does mean is that, in a GMP environment, implementing a holistic decision support platform will impact positively on each stage of development or manufacture, but will also require stakeholders from multiple departments to come on board. 'It may mean a change in standard operating procedures (SOPs) and thus revalidation of existing informatics systems,' Anderson pointed out. 'It's a major shift in the way companies work and think. Each department will have to be engaged, and embrace change.'

ACD/Labs works with companies to help them to implement this decision support interface, which the firm maintains not only helps reduce the issue of data scavenging, but also results in more seamless productivity, greater insight throughout the production process and troubleshooting, and easier access to batch data for routine or ad hoc analytical assessment.
Illuminating chromatography data

Critically, ACD/Labs is aware that for many companies this will mean integrating