Spreadsheet or Excel Validation: There is No Risk!

Many companies don’t like to validate spreadsheets; some even say it can’t be done. Is this just a myth? If you’ve got data to compare, or missing entries to find, why do it manually? Because a human is more efficient than a computer? Right? Wrong. Checking lists is boring – no one wants to do it, and that is why there are always mistakes. By qualifying a spreadsheet to do this job you are taking a lesser risk – yes, a lesser risk of failing to achieve the correct results consistently.

The computer can check continuously without making a mistake, and it is always consistent; even if it were wrong, it would be consistently wrong, which still makes it better than a human being. Consistent mistakes can be consistently rectified!
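
To make the idea concrete, here is a minimal sketch of the kind of check being described: comparing two exported lists and flagging missing entries automatically. It is written in Python purely for illustration (the same comparison could just as easily be built from spreadsheet formulae or a macro), and the file names and column name are assumptions, not part of any standard.

```python
# Minimal sketch: flag entries missing from one export relative to another.
# File names and the "sample_id" column are hypothetical.
import csv

def load_ids(path, column="sample_id"):
    """Read one column from a CSV export into a set of identifiers."""
    with open(path, newline="") as handle:
        return {row[column].strip() for row in csv.DictReader(handle)}

expected = load_ids("lims_export.csv")          # the reference list
received = load_ids("spreadsheet_export.csv")   # the list being checked

missing = sorted(expected - received)     # entries a tired human checker might overlook
unexpected = sorted(received - expected)  # entries that shouldn't be there at all

print(f"{len(missing)} missing entries: {missing}")
print(f"{len(unexpected)} unexpected entries: {unexpected}")
```

Run against the same two files a hundred times, this check will give the same answer a hundred times – which is precisely the consistency argument above.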

Evolution of Computerised Systems

So, what’s the risk in qualifying a spreadsheet? None. Some people might not like validating spreadsheets, but it’s probably one of the simplest processes to actually validate. In this ever-changing industry, one thing is for sure – the use and evolution of computerised systems and equipment has never been so high. With MES becoming more and more popular, more equipment is emerging with the capability to export to a spreadsheet – even software for the likes of scales and pH meters with Ethernet or serial ports can export data in spreadsheet formats.

Data is just data

The truth is, by exporting something to a spreadsheet a whole new world of opportunities appears. Data is just data, plain and simple. On its own it’s pretty useless, and almost everything generates a load of it. Information, on the other hand, is useful data; it is informative and valuable. Can you see where this is going? Yes: get your data into a spreadsheet, validate the process, apply some ‘magic’, and hey presto.

“The QA department won’t accept this”, I hear you calling. Present them with this: what level of confidence does their current system give for manual entries and checks (which probably aren’t audited)? Do it anyway and demonstrate your efforts. So far, we haven’t failed with our QA departments.

Improve and Enhance

Post-validation you will have information that supports your validation activities and your business systems and processes in general (including QA) – this actually provides a basis for making improvements and enhancements. Once the barriers are down, do it again on another system, and before you can say 2011 you’re exploiting all of your data, becoming a more mature, information-rich site that actually uses its data (sorry, information) properly and naturally. If you treat your everyday spreadsheets as proper tools, as naturally as you tend your garden at home, you’ll be both healthy and informed, as well as efficient, in the digital age!

So how do you Validate the Spreadsheet?

So, now that you’ve bought into this process, I hear you asking: “So how do we validate the spreadsheets then?”

Well, that depends on the level of complexity of the sheet and which record will actually constitute the GxP data. If you have a simple spreadsheet using a few basic formulae to perform a calculation or a count, a basic test will suffice, and this can hang off an existing validation plan; an existing SOP or Work Instruction (WI) can detail the operation to facilitate ongoing consistency.

An example of such usage would be an export from some lab equipment to a spreadsheet in order to perform basic analysis on the data set. In this case it would be usual for the master GxP data to reside within LIMS, MES or a Production Batch Record, so the basic level of validation would permit the printout to be used to support GxP activities whilst the actual raw data is held in another, already-validated system.
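
As an illustration of what such a basic test might look like, the sketch below re-computes a simple spreadsheet total independently and compares it with the value the sheet itself reports. The workbook name, sheet name and cell references are hypothetical, and the openpyxl library is just one convenient way to read the cached formula results; an equivalent manual check recorded on a test sheet would serve the same purpose.

```python
# Minimal sketch: verify a basic spreadsheet formula against an independent calculation.
# Workbook name, sheet name and cell references are hypothetical.
from openpyxl import load_workbook

# data_only=True returns the last value Excel cached for each formula cell.
wb = load_workbook("lab_export.xlsx", data_only=True)
ws = wb["Results"]

# Independently total the raw readings in column B, rows 2 to 21.
readings = [ws.cell(row=r, column=2).value for r in range(2, 22)]
independent_total = sum(v for v in readings if v is not None)

# The sheet's own =SUM(...) result is assumed to sit in cell B22.
sheet_total = ws["B22"].value

assert abs(sheet_total - independent_total) <= 1e-9, (
    f"Sheet reports {sheet_total}, independent total is {independent_total}"
)
print("Formula check passed:", sheet_total)
```

The executed check, together with the printout it verifies, can then be attached to the existing validation plan as the test evidence mentioned above.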

Computer Systems Validation

“I have more than basic formulae in my spreadsheet” (yes, I heard you while I was explaining the above). In this case let’s assume a VBA macro is used in conjunction with some input data on a spreadsheet. This brings us into the realm of computer systems validation, but don’t be scared! It means that the SDLC must be followed (to an extent), and a risk-based approach is now required (GAMP 5 places custom applications such as this in Category 5).
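
As a sketch of what testing such a macro might involve, the example below drives Excel with known inputs and compares the macro’s output against independently calculated expected results. It assumes a Windows machine with Excel and the pywin32 package installed; the workbook path, macro name, cell references and test values are all hypothetical.

```python
# Minimal sketch: exercise a macro-driven sheet with known inputs and
# compare its output against expected results. All names are hypothetical.
import win32com.client

test_cases = [
    # (value written to Input!B2, expected result in Input!D2)
    (10.0, 100.0),
    (2.5, 6.25),
]

excel = win32com.client.Dispatch("Excel.Application")
excel.Visible = False
try:
    wb = excel.Workbooks.Open(r"C:\validation\macro_sheet.xlsm")
    ws = wb.Sheets("Input")
    for value, expected in test_cases:
        ws.Range("B2").Value = value
        excel.Run("Module1.CalculateResult")  # hypothetical macro name
        actual = ws.Range("D2").Value
        status = "PASS" if abs(actual - expected) < 1e-9 else "FAIL"
        print(f"input={value} expected={expected} actual={actual} -> {status}")
    wb.Close(SaveChanges=False)
finally:
    excel.Quit()
```

The point is not the tooling – the same test cases could be executed by hand and recorded on paper – but that each requirement on the macro is exercised with known inputs and documented expected outputs.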

Remember that duplication of effort doesn’t yield many business benefits (if any), and quality rarely benefits from it either. Most organisations allow the suite of validation documents to be scaled to suit the job in hand. Would it really be beneficial to have a URS, FDS, Test Plan, etc. PLUS all of the associated reports for a small(ish) validation effort when the same people are required to sign off on all of them anyway?

Create a New Specification

At this juncture I would recommend creating a new specification to outline the user, business and functional requirements for the spreadsheet validation effort. In addition, update the validation plan (VP) for the originating functional area or project to include the spreadsheet validation, outline the test plan and qualification deliverables (e.g. IQ/OQ – these can be scaled into a single IOQ document), and make reference to the SOP or WI that will control the operation of the spreadsheet. Be sure to include backward and forward traceability between all of the documents (e.g. the specification should reference the VP and vice versa).

Conclusion

Reading back through this article, I realise I make the subject matter sound very simple. To an extent it is, but this is an overview, and the level of ease depends on the source data, what you’re going to do with it, the support of your manager and, hopefully, a forward-thinking QA department. The basic steps are:

1) Identify the GxP data – will it reside in the spreadsheet or in another system (manual or electronic, it doesn’t matter)?

2) Take a risk-based approach to ascertain the level of complexity of the sheet. This will yield one of several outcomes, ranging from full validation with a source code review and new documents down to a few simple tweaks and a test hanging off an existing structure.

3) Documents. Don’t forget that these can be scaled to reflect the size of the job; there is no reason why this can’t involve the amalgamation of documents to better support the effort.

And for my last trick…

And finally, we’re in a position where the SDLC has been fulfilled: the spreadsheet has been designed, implemented and tested, it is validated, and the validation is managed via the VP. Your colleagues haven’t gone mad from having to sign a million documents, because you only needed a couple of new ones and a couple of updates to existing documents.

What next? Execute the tests, then write, issue and approve the reports; you’re then in compliance and ready to use these spreadsheets in GxP activities.

Whilst performing this initial spreadsheet validation effort, you have created a model that will permit you to keep on validating different spreadsheet applications with various data sources. The same rules apply, but as with everything else, the more you do, the easier it gets.

Author

Mark Richardson

Technical Expert, Kestrel Life Sciences