9 Steps to Overcoming Data Integrity Challenges in the Laboratory

Being a Laboratory Manager in a regulated environment (FDA, CLIA, CAP, etc.) can be stressful for a number of reasons, not the least of which is the increasing scrutiny by regulators around data integrity.

While we can all agree that data integrity is of significant importance in the laboratory setting, the process by which it is achieved and maintained continues to be the subject of much debate. Without a doubt, this mission is daunting in the face of demanding daily operations and evolving interpretation of regulations.

Today’s Laboratory Environment

Spreadsheets are being created, results are being processed in open-source systems, data is being stored in a cloud somewhere, systems are being upgraded, and no one is 100% clear whether they rely on paper records, electronic records…or both!

What is the primary, ‘official’ record?

How does Part 11 apply?

Where do we start?

Clients ask AZZUR IT these questions all the time. While there is no standard answer, I have provided some basic steps that should help you get started.

For years, Laboratory Managers have faced multi-faceted challenges in keeping their laboratories compliant with Part 11 and the predicate rules, including:

  • Rogue spreadsheets
  • Data stored locally and not backed up
  • Maintaining the validated state of software and instruments
  • New software introduced to process data but not validated or controlled
  • Maintaining user access privileges

Take A Deep Breath

This does not have to be stressful! There IS a way to manage the chaos and sleep better at night. By using a common-sense, methodical approach and spending a bit of time up front to identify compliance issues, you can evaluate (and improve where needed) data integrity controls and have increased confidence going into the next inspection.

By following these 9 steps, you will be able to identify and implement the controls necessary to protect data and pass your next inspection:

  1. Create a software inventory – you must know what you have in order to control it (a minimal sketch follows this list).
  2. Determine criticality.
  3. Map the data flow through the various systems and networks.
  4. Review SOPs for alignment with the data flow. Are employees relying on paper or electronic records? Are all tools defined?
  5. Review Part 11 applicability.
  6. Review controls and validation status.
  7. Ownership – confirm all parties understand their responsibilities.
  8. Determine what new controls are necessary.
  9. Track to closure and monitor periodically.
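
To make steps 1, 2 and 6 a bit more concrete, here is a minimal sketch of how a software inventory with criticality and validation status might be captured. The systems, field names and values below are hypothetical examples, not a prescribed format; most laboratories would keep this information in a controlled spreadsheet or inventory tool rather than a script.

  from dataclasses import dataclass

  @dataclass
  class SystemRecord:
      """One entry in the lab software inventory (fields are illustrative)."""
      name: str
      version: str
      gxp_critical: bool      # creates, processes or stores regulated data?
      part11_records: bool    # holds electronic records subject to Part 11?
      validated: bool         # currently in a validated state?
      data_owner: str         # who is accountable for the data it holds?

  # Hypothetical inventory entries for illustration only.
  inventory = [
      SystemRecord("HPLC acquisition software", "3.2", True, True, True, "QC Lab"),
      SystemRecord("Results spreadsheet", "n/a", True, True, False, "QC Lab"),
      SystemRecord("Statistical package", "9.4", True, False, False, "QC Lab"),
      SystemRecord("Scheduling tool", "1.1", False, False, False, "Lab Ops"),
  ]

  # Steps 2 and 6: flag critical systems whose controls need attention.
  for system in inventory:
      if system.gxp_critical and not system.validated:
          print(f"Gap: {system.name} is critical but not validated "
                f"(owner: {system.data_owner})")

Even a simple record like this makes it obvious where step 8 (new controls) should focus.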

Consider This Example

A simple but typical data flow (a verification sketch follows the list):

  1. A raw data file is created by the lab instrument
  2. The file is saved to a network drive
  3. The file is imported into a spreadsheet for processing
  4. The spreadsheet with data is saved locally
  5. The spreadsheet is uploaded to an FTP site
  6. The file is retrieved and imported into a statistical program for further processing
  7. The report is printed
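
A simple technical control that supports a flow like this is a checksum recorded when the raw file is created and verified at each later hop. The sketch below, written in Python against hypothetical placeholder paths, only illustrates the idea of verifying the copy on the network drive against the instrument output; a real implementation would be part of a validated, controlled transfer process.

  import hashlib
  from pathlib import Path

  def sha256_of(path: Path) -> str:
      """Return the SHA-256 digest of a file, read in chunks."""
      digest = hashlib.sha256()
      with path.open("rb") as handle:
          for chunk in iter(lambda: handle.read(8192), b""):
              digest.update(chunk)
      return digest.hexdigest()

  # Placeholder paths: the instrument output and its copy on the network drive.
  source = Path("instrument_output/run_001.raw")
  network_copy = Path("//lab-share/raw_data/run_001.raw")

  # Record the checksum at creation, then verify after the file is copied.
  if sha256_of(source) != sha256_of(network_copy):
      raise ValueError("Checksum mismatch: raw data file changed in transit.")
  print("Raw data file verified intact on the network drive.")

The same check can be repeated at steps 3, 5 and 6 of the flow, so that any silent change to the raw data is detected before results are reported.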

It’s Better to Uncover Issues Now

As you can see from the above example, many questions could be asked: Is all of the software validated and under change control? Is the network drive secure and controlled? Is the FTP site secure? Are raw data files and audit trails available for review by QC? Are files backed up routinely? And so on.

I’m sure that you can think of a few more! Identifying the software and data flow is a key step in getting this process started. It’s certainly better to gain an understanding of the situation now rather than later when auditors are asking questions.
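
To show how one of these questions can become a routine check (step 9), the sketch below compares files on a primary share against a backup location and flags anything missing or different in size. The directory names are placeholders and the approach is only illustrative; in practice this kind of reconciliation would be performed by, or checked against, your controlled backup system.

  from pathlib import Path

  # Hypothetical locations: the live network drive and the latest backup set.
  primary = Path("//lab-share/raw_data")
  backup = Path("//backup-server/raw_data")

  missing, mismatched = [], []
  for source_file in primary.rglob("*"):
      if not source_file.is_file():
          continue
      backup_file = backup / source_file.relative_to(primary)
      if not backup_file.exists():
          missing.append(source_file)
      elif backup_file.stat().st_size != source_file.stat().st_size:
          mismatched.append(source_file)

  print(f"Files missing from backup: {len(missing)}")
  print(f"Files whose backup copy differs in size: {len(mismatched)}")

A size comparison is deliberately crude; pairing it with the checksum approach above gives stronger assurance.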

Data Owners Must Be Accountable and Communicate

In this example you may have several data owners, including IT, and it is essential that each owner is aware of their responsibility for data integrity controls. Data integrity gaps often exist where communication breaks down between the different departments responsible for systems that hold regulated data.

Take The First Step

We have discussed some of the data integrity challenges facing Laboratory Managers today and listed general steps to get your arms around the situation.

A final thought: in order to address data integrity concerns holistically and successfully, we must shift our focus from the current mindset of validating individual components in a data flow to understanding the entire data lifecycle and ensuring controls are appropriately placed. Follow the checklist above to get started today.

AZZUR IT is happy to help if you need more advice (just email askaboutIT@azzur.com).

Author

Doug Shaw

Director of IT & CSV Consulting at Azzur IT