QC Laboratory data integrity discussions at NHS Symposium

September 23, 2015 / by Paul Moran

NHS Symposium

It was wonderful to see such a great turnout at the UK NHS Pharmaceutical Quality Assurance and Technical Services Symposium 2015.  The Symposium included several informative break-out sessions to discuss current key topics facing the industry.  Dr. Julian Smith (Viridian Pharma Ltd) and I had the opportunity to host one of the sessions to cover ICH analytical method validation and data integrity.

Companies within the Life Sciences industry have seen a significant re-focus on data integrity recently.  On 17th March 2015 the Medicines and Healthcare products Regulatory Agency (MHRA) published updated GMP pre-inspection compliance report templates and guidance.  The new format now covers data integrity aspects, including:

  • Confirmation of a policy on data integrity / data governance.

  • Confirmation that computerised system owners / personnel with administrative-level permissions will be made available during the GMP inspection.

  • Information on computerised systems used for storage, control and processing. The MHRA make specific reference to Laboratory Information Management Systems (LIMS) used within Quality Control laboratories, and to manufacturing execution systems.

  • A listing of all principal computerised systems such as LIMS, ERP and CDS. They also request information on stand-alone systems and qualification dates.

Several of the attendees are expecting an MHRA inspection of their QC Laboratories within the next 12 months and no doubt will be quizzed on both method validation and data integrity.

The MHRA guidance highlights the ALCOA principles, which are a useful tool to help with your data integrity audit preparations:

         A = Attributable

         L = Legible and permanent

         C = Contemporaneous

         O = Original

         A = Accurate


Attributable:

The identity of the person completing a record should be unambiguous. The use of aliases or abridged names should only be permitted where they are used consistently and are attributable to an individual. A shared alias or IT system log-in which cannot differentiate between individuals should not be used.

Legible (permanent):

It should not be possible to modify or recreate data without an audit trail which preserves the original record. It is important not to forget paper records in this context. Blank forms for manual recording of data should also be controlled in a manner which prevents unauthorised re-creation.
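The principle that data cannot be modified without preserving the original record can be illustrated with a minimal sketch. This is a hypothetical, illustrative design (the class and field names are my own, not from any particular LIMS): every amendment is appended as a new audit-trail entry with a documented reason, and the original value is never overwritten.

```python
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass(frozen=True)
class AuditEntry:
    """One immutable audit-trail entry: who recorded what, when, and why."""
    user: str
    timestamp: str
    action: str          # "create" or "amend"
    value: str
    reason: str = ""

class Record:
    """A record with an append-only history: amendments never overwrite
    the original entry, so the full audit trail is preserved."""

    def __init__(self, user: str, value: str):
        self._trail: list[AuditEntry] = [
            AuditEntry(user, self._now(), "create", value)
        ]

    @staticmethod
    def _now() -> str:
        return datetime.now(timezone.utc).isoformat()

    def amend(self, user: str, new_value: str, reason: str) -> None:
        # A GMP-compliant correction requires a documented justification.
        if not reason:
            raise ValueError("corrections require a documented reason")
        self._trail.append(
            AuditEntry(user, self._now(), "amend", new_value, reason)
        )

    @property
    def current_value(self) -> str:
        return self._trail[-1].value

    @property
    def audit_trail(self) -> list[AuditEntry]:
        return list(self._trail)
```

For example, amending a result leaves the original entry intact: after `rec = Record("jsmith", "10.2 mg")` and `rec.amend("jsmith", "10.4 mg", "transcription error")`, the current value is "10.4 mg" while the trail still shows the original "10.2 mg" and who changed it.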


Exceptionally, there may be a valid reason to re-create a record, eg. where it has been damaged beyond use, or where an error does not enable a GMP-compliant correction of the original. This must be managed through the quality system, either by making a ‘true copy’ (verified as being a true replicate of the original), or by re-writing a new copy and retaining the original as evidence. In all cases, this must be approved through the quality system, with QA oversight and justification for the action.

It is generally accepted that correction fluid is not acceptable in GMP areas.

However, companies may be unaware that their computerised systems often have ‘data annotation tools’ enabled. These permit changes to data which can alter the appearance of reports, and may not have a visible audit trail. From a practical perspective, this is ‘electronic correction fluid’, and should not be permitted.


Contemporaneous:

System design has a significant impact upon contemporaneous record keeping. The availability of records in the right place at the right time removes the need for staff to use loose scraps of paper, or their memory, to retain information for retrospective completion in the official record.

When inspecting packaging operations, I still find it a common approach for manufacturers to use a single batch packaging record (BPR) for blistering and cartoning of a solid dosage form. However, if the BPR is located in the secondary packing area, it is impossible for staff in the primary packing area to make contemporaneous records, and vice versa. The BPR may also require periodic checks, such as equipment performance. Specifying exact time intervals (eg. ‘every 60 minutes’) may result in an incentive for staff to ‘back date’ the time of the check if they were occupied at the exact time the activity was required. The system is encouraging staff to falsify the record, particularly if there is concern that missing an exact time point might lead to disciplinary measures.


This can be addressed by two simple changes: specifying an acceptable window for completion of the activity (eg. ‘every 60 ±5 minutes’), and splitting the BPR into two parts (primary and secondary). Together these encourage the correct behaviour, and remove both the opportunity and the incentive to falsify the record.
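The acceptable-window idea can be sketched as a simple check. This is an illustrative example only (the function name and defaults are my own): a check performed within the interval ± tolerance is considered timely, removing the incentive to backdate.

```python
from datetime import datetime, timedelta

def check_is_timely(previous_check: datetime, this_check: datetime,
                    interval_minutes: int = 60,
                    tolerance_minutes: int = 5) -> bool:
    """Return True if the time elapsed since the previous in-process
    check falls within the acceptable window (interval +/- tolerance)."""
    elapsed = this_check - previous_check
    lower = timedelta(minutes=interval_minutes - tolerance_minutes)
    upper = timedelta(minutes=interval_minutes + tolerance_minutes)
    return lower <= elapsed <= upper
```

Under a ‘60 ±5 minutes’ rule, a check performed 58 minutes after the previous one is acceptable, while one performed 70 minutes later would be flagged for investigation rather than silently backdated.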


Original:

Original records must preserve data accuracy, completeness, content and meaning. Metadata (data about data) is vital in this aim by enabling reconstruction of an activity – who did what, where and when. There are certain limitations in relation to file formats which may not maintain the full metadata record; so-called ‘flat files’ such as .pdf, .doc etc. We may know who created the file, and when, but there may be no information on how, when or by whom the data presented in that document was created, processed or amended. There is therefore an inherently greater data integrity risk with flat files, as they are easier to manipulate and delete as a single record with limited opportunity for detection.


Accurate:

Automated data capture, with the required IT controls, provides greater control over the accuracy of a record. Where automation is not possible or feasible, real-time second operator verification of quality-critical observed values may be necessary.

Data review must include a review of raw data in its original form. If access to electronic raw data is not possible remotely, this is a good opportunity for the reviewer to escape the confines of their office. Reviewing paper copies or flat file reports of electronic data, even from a validated secure system, is unlikely to enable detection of anomalies. This is because the preparation of reports still requires operator intervention, which can influence what data is reported, and how it is presented.




Written by Paul Moran

Dr. Paul Moran is the founder of Broughton Laboratories Ltd and Broughton Software Ltd, and serves as their Chief Executive. A chemistry graduate with a PhD in Biotechnology, Paul began his career at the US pharmaceutical manufacturer Johnson and Johnson. In his role as QC Laboratories Manager, Paul obtained Six Sigma Black Belt certification. Paul entered the world of contract QC testing in 2003, supporting the successful growth of a contract laboratory and leading to its merger in 2005. Paul started his first venture in 2006, establishing Broughton Laboratories as one of the leading UK MHRA and US FDA GMP licensed contract laboratories, with its own dedicated stability storage facility which opened in 2011. The spin-off company, Broughton Software, was established in 2012 to provide a LIMS solution for regulated QC laboratories. LabHQ LIMS launched its 4th release in September 2015.