Informatics in Southern Europe


original data from accidental or intentional modification, falsification, or even deletion, which is key to reliable and trustworthy records that will withstand scrutiny during regulatory inspections. Company policies on data governance, together with the 21 CFR Part 11 capabilities available in most informatics tools, should be sufficient to keep data records intact. However, many FDA warning letters cite precisely a failure to implement these rules and tools.
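The protection described above — keeping records safe from undetected modification or deletion — can be illustrated with a minimal, hypothetical sketch: an append-only log in which each entry stores a hash of its predecessor, so that any later edit or deletion breaks the chain. This is an illustration of the tamper-evidence principle only, not a 21 CFR Part 11-compliant implementation; all names and fields are invented for the example.

```python
import hashlib
import json

def _entry_hash(body: dict) -> str:
    # Hash the canonical JSON form of an entry body.
    return hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()

class AuditLog:
    """Append-only record log; each entry carries the hash of the previous one."""

    def __init__(self):
        self.entries = []

    def append(self, who: str, when: str, action: str, reason: str) -> None:
        # 'who, when, for what and why' — chained to the previous entry's hash.
        prev = self.entries[-1]["hash"] if self.entries else "0" * 64
        body = {"who": who, "when": when, "action": action, "reason": reason, "prev": prev}
        self.entries.append({**body, "hash": _entry_hash(body)})

    def verify(self) -> bool:
        # Recompute every hash; an edited or deleted entry breaks the chain.
        prev = "0" * 64
        for e in self.entries:
            body = {k: v for k, v in e.items() if k != "hash"}
            if body["prev"] != prev or _entry_hash(body) != e["hash"]:
                return False
            prev = e["hash"]
        return True
```

In practice this role is played by the audit-trail features built into the informatics tools themselves; the sketch simply shows why a record that is altered after the fact no longer verifies.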


Informatics tools can help


Enabling the 21 CFR Part 11 capabilities that laboratory informatics tools offer today can potentially solve most of these issues. Even so, those capabilities need to be assessed during the selection process to ensure that they can be effectively activated in every key process, phase, and step. The impact of misusing these capabilities is severe: control over the accuracy of the data disappears, as does protection against editing, modification, or deletion, and with it all traceability of 'who, when, for what and why' for the record. There have been instances of people who were supposedly absent accessing the system, indicating that they had shared their username and password. There are multiple ways to verify users' competences when they interface with informatics systems, through review of processes, procedures, and systems.

Data integrity is not only about accuracy and protection; the data should also remain within its original context and retain its relationships to other data records. Ensuring the integrity of critical data and metadata is necessary for all computerised laboratory systems, since raw data, electronic records, and metadata depend upon their context within laboratory processes. Data integration is effectively mandatory, and requires that companies gain a good understanding of the available solutions and technical tools in order to evaluate the level of customisation that providers claim to offer. During one laboratory informatics


selection process, we prepared an exhaustive request-for-information document and sent it to more than 30 companies, receiving responses from about 20 of them. Although only a very limited number have local representation in southern Europe, many companies are now emerging with new, cloud-based products. Some have already developed specific relationships with key players in pharmaceutical companies, such as leaders in CRM, document management, and chromatography solutions. Yet the technical solutions provided for these interactions are far from new or revolutionary: some companies can develop drivers to interact with a long list of instrumentation; others rely on customisation, billing coding hours from their technical experts.

One of the key issues in protecting data integrity is data integration. In most laboratory data management projects, it is important that the newly implemented systems are capable of interfacing with existing systems (ERP, MES, EBR, DMS, QMS,




CDS, etc.). Developing a solid and reliable integration is key to passing an audit of IT systems. There are, of course, a variety of options for how data can be interchanged between systems, ranging from a purely manual interface (data copied by hand from one system to another) to a fully automated one (no human interaction). First, it is critical that the most reliable and accurate process behind the integration is designed into the project. Second, the risks associated with each technical option must be evaluated, proper procedures must be developed wherever a manual or semi-automatic integration is chosen, and, finally, solid documentation should support the solution implemented.
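At the fully automated end of that spectrum, an interface typically validates each record before passing it downstream and preserves its provenance. The fragment below is a hypothetical sketch of such a transfer step; the system names, field names, and `transfer` helper are invented for illustration and do not correspond to any vendor's API.

```python
from dataclasses import dataclass

@dataclass(frozen=True)  # frozen: the transferred record cannot be silently edited
class Result:
    sample_id: str
    analyte: str
    value: float
    units: str
    source_system: str  # provenance: where the record originated

def transfer(record: dict, sink: list) -> Result:
    """Validate a raw export record, then append an immutable Result to the sink."""
    required = {"sample_id", "analyte", "value", "units"}
    missing = required - record.keys()
    if missing:
        # Reject rather than silently pass incomplete data downstream.
        raise ValueError(f"incomplete record, missing: {sorted(missing)}")
    result = Result(
        sample_id=record["sample_id"],
        analyte=record["analyte"],
        value=float(record["value"]),
        units=record["units"],
        source_system=record.get("source", "CDS-export"),
    )
    sink.append(result)
    return result
```

The design choice worth noting is that the interface rejects incomplete records instead of filling gaps — exactly the behaviour a manual copy-paste interface cannot guarantee, which is why the risk assessment for each integration option matters.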


Don’t start with the technology


When all these steps are considered and properly implemented, the project can treat the integration of systems as just another phase of the implementation, without worries about the implications for data integrity. All in all, the technical solutions should be the last step in a broader set of activities that starts with process definition, deep analysis, risk assessment, and implementation. If implementation projects are executed according to a 'top-down' approach, the technical solutions are simply tools intended to resolve a specific step in the process. When projects are designed 'bottom-up' (starting from the technical tools), the bigger picture can be missed: the result is a patchwork rather than a clear and simple picture.


www.scientific-computing.com/BASL2017



