Read Part 1 First

This is the second part of a two-part article; Part 1 should be read before continuing.

Failure Mode and Effects Analysis (FMEA)

Failure Mode and Effects Analysis (FMEA) is an inductive reasoning risk assessment tool that considers risk as a product of the following components:

  • Severity of the consequences of a potential failure (S)
  • Likelihood of a potential failure occurring (O)
  • The probability of a failure not being detected (D)

The risk assessment process consists of:

Assigning a risk level (High, Medium or Low) to each of the above risk components. With detailed knowledge and experience of the principles and functioning of the instrument being qualified, it is possible to assign risk levels objectively to both the likelihood of a failure occurring and the probability of a failure not being detected. The likelihood of a failure occurring can be considered in terms of the expected time interval between recurrences of the same failure.

Assigning risk levels to the probability of not detecting a failure requires knowledge of how a failure of a particular instrument function will manifest itself. For example, if a failure in the instrument operating software meant the spectrophotometer could not be operated at all, the failure would be readily detected, and a low non-detection risk level would be assigned. Conversely, an inaccuracy in the absorbance measurement would not be readily detected unless a calibration was performed; a failure in the absorbance measurement functionality of a spectrophotometer would therefore be assigned a high non-detection risk level.

The assignment of the severity risk level is somewhat more subjective and will also depend to some degree on the requirements of the respective laboratory. One approach is to consider the severity risk level as the sum of the quality, compliance and business severity levels.

Suggested criteria for assigning risk levels to each component of the overall risk assessment discussed above are presented in Table 2. These criteria are most suitable for use in a regulated product QC environment; other applications of laboratory analysis may require a different set of assignment criteria. For example, in a forensic laboratory the quality exposure would relate to the effects a failure could have on the outcome of a criminal trial.

Calculating [5] the overall risk level entails:

1. Assigning a numerical value to each severity risk level for each of the different severity categories as shown in Table 3

2. Summing the numerical values of the severity levels for the three severity categories, yielding a numerical total severity level of between 3 and 9

3. The numerical total severity level can then be converted into a qualitative total severity level, as shown in Table 4

4. Multiplying the qualitative total Severity (S) level by the probability of Occurrence (O) level yields a Risk Class, as shown in Table 5.

5. The Risk Factor can then be calculated by multiplying the Risk Class by the non-detectability (D) level, as shown in Table 6.

A valuable feature of this approach is that the calculation gives additional weight to the occurrence and detectability factors in the calculation of the Risk Factor. For example: In a scenario where a failure has a high level of severity, but is unlikely to occur and is also readily detectable the overall risk factor is low. Conversely, in a scenario where the potential severity is low, but the failure is likely to occur frequently and is not easily detectable the overall risk factor is high.
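The calculation steps above can be sketched in a few lines of code. Note that Tables 3 to 6 are not reproduced here, so the numerical values and class boundaries below are illustrative assumptions, not the published tables; the structure of the calculation, however, follows the five steps described above.

```python
# Illustrative sketch of the FMEA Risk Factor calculation described above.
# The point values and band boundaries are assumptions standing in for
# Tables 3-6, which are not reproduced in this article.

SEVERITY_POINTS = {"Low": 1, "Medium": 2, "High": 3}
LEVEL = {"Low": 1, "Medium": 2, "High": 3}


def total_severity(quality, compliance, business):
    """Sum the three severity categories (3..9) and convert the total
    into a qualitative level (assumed banding)."""
    total = sum(SEVERITY_POINTS[s] for s in (quality, compliance, business))
    if total <= 4:
        return "Low"
    if total <= 6:
        return "Medium"
    return "High"


def risk_class(total_sev, occurrence):
    """Multiply total Severity (S) by Occurrence (O) to give a Risk Class."""
    product = LEVEL[total_sev] * LEVEL[occurrence]
    if product <= 2:
        return "Low"
    if product <= 4:
        return "Medium"
    return "High"


def risk_factor(r_class, non_detectability):
    """Multiply the Risk Class by non-detectability (D) to give the Risk Factor."""
    product = LEVEL[r_class] * LEVEL[non_detectability]
    if product <= 2:
        return "Low"
    if product <= 4:
        return "Medium"
    return "High"


# Severe but unlikely and readily detected -> low overall Risk Factor
rc = risk_class(total_severity("High", "High", "High"), "Low")
print(risk_factor(rc, "Low"))  # -> Low
```

With these assumed bandings, the two scenarios in the preceding paragraph come out as expected: high severity with low occurrence and good detectability yields a low Risk Factor, while low severity with frequent occurrence and poor detectability yields a high one.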

Thus severity, which is often difficult or even impossible to mitigate, will not dominate the overall risk associated with a particular functional failure, whereas occurrence and non-detectability, which can be mitigated more easily, exert more control over the overall risk.

Discussion

The risk assessment process consists of four principal steps, as listed below:

1. Performing an assessment in the absence of any mitigating controls or procedures

2. Based on the findings of this assessment, establish controls and procedures designed to mitigate the assessed risk

3. Perform a post-mitigation risk assessment to determine the effectiveness of the mitigation

4. If necessary, establish additional mitigating controls and procedures and re-assess

The risk assessment summarised in Table 7 and discussed below has been considered from the perspective of the pharmaceutical and allied industries. The same process can be applied to any other sector; where different priorities apply, different but no less valid conclusions may be reached.

Initial Assessment

Starting with the operational functions of the spectrophotometer, the wavelength accuracy and precision, and the spectral resolution power of the spectrophotometer determine its ability to perform identity testing of a product based on its UV/visible spectrum. Any inaccuracies or lack of precision of wavelength determination or deficiencies in the resolving power of a spectrophotometer could result in erroneous identification.

This could lead to misidentified products being released and reaching the consumer, and in turn to product recalls resulting in major cost or loss of revenue. Thus this would represent a high level of risk in each of the severity categories.

Stray light affects the accuracy of absorbance measurements. Modern instruments can compensate for stray light, but require it to be determined and stored in the spectrophotometer’s operating software. Any inaccuracies in the stored stray light parameters will result in inaccurate absorbance measurements, with the same consequences as those discussed below for photometric stability, noise and accuracy, and baseline flatness. Consequently this represents a high level of risk in each of the severity categories.

Wavelength accuracy and precision, resolving power and stray light are largely dependent on the optical properties of the spectrophotometer. Modern diode array instruments have no moving parts and are consequently assigned a medium likelihood of occurrence. However, in the absence of specific checks, a failure of these functions is unlikely to be detected, and thus a high risk level is assigned for non-detectability.

Photometric stability, noise and accuracy, and baseline flatness will affect the accuracy of the measured absorbance. If the spectrophotometer is being used to make quantitative measurements, any error in the measured absorbance could result in erroneous results being reported. If the reported results, from these measurements, are used to release a batch of pharmaceutical product to the market this could result in substandard batches of product reaching the consumer.

These batches would have to be recalled, resulting in major cost or loss of revenue. Thus this would represent a high risk in each of the severity categories. These functions depend on the condition of the UV lamp, and UV lamps have a typical life of approximately 1500 hours, or about nine weeks of continuous use; this therefore represents a high risk of occurrence. Furthermore, in the absence of any precautions, a failure of any of these functions is unlikely to be detected, representing a high non-detectability factor.

Turning to the data quality and integrity functions: the test results are used to make decisions regarding the suitability of pharmaceutical products for their intended purpose. Any compromise in the accuracy or integrity of the records created could result in products of uncertain quality being released to the market, which could harm the consumer; products might then have to be recalled, resulting in major cost to the laboratory or company. This therefore represents a high level of risk in each of the severity categories. However, once the necessary software configurations have been successfully established, a functional failure is unlikely to occur, and any failure would be readily detected.

For example:

  • Restricting access to authorised persons is achieved by the system requesting a username and password before opening the relevant operating program. If this function failed, the system would no longer request the username and password, so the failure would be readily detectable. There is therefore a low risk of non-detection.
  • When a file that requires an electronic signature is created, a dialogue box opens requesting a username and password. If this function failed, the dialogue box would not open and the failure would be detected.

Mitigation

Although it is not possible to mitigate the severity of a failure of the operational functions, it is possible to significantly reduce the likelihood of a failure occurring and to increase the probability of detecting one. It is recommended to qualify the following functions before initial use:

  • wavelength accuracy and precision
  • spectral resolution
  • stray light
  • photometric accuracy, stability and noise
  • spectral base line flatness

and then to requalify at appropriate intervals; this will significantly reduce both the likelihood of failure and the probability of any failure going undetected. As photometric stability, noise and accuracy, and baseline flatness all depend on the condition of the UV lamp, and typical deuterium lamps have a lifetime of approximately 1500 hours (nine weeks) of continuous use, it is recommended that the operating procedure require the lamp(s) to be turned off when the spectrophotometer is not in use. It is also recommended to perform a preventative maintenance (PM), including lamp replacement, and re-qualification (RQ) every six months.

The rationale for this re-qualification period is based on the lifetime of a typical UV lamp: approximately 185 eight-hour days, corresponding to the number of weeks listed in Table 8. Thus, if the spectrophotometer is used four or five days a week, the UV lamp would last about eight to ten months.

Performing a six-monthly preventative maintenance and requalification (PM/RQ) allows a margin of safety. If the spectrophotometer is used six or seven days a week, the lamp would be expected to last about six months, and a three-monthly PM/RQ would be more appropriate to incorporate a suitable safety margin. Conversely, if the spectrophotometer is used only once or twice a week, a twelve-monthly PM/RQ would be acceptable.
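The arithmetic behind these PM/RQ intervals can be sketched as follows, assuming the 1500-hour lamp life and eight-hour working days stated above:

```python
# Sketch of the lamp-life arithmetic behind the suggested PM/RQ intervals.
# Assumes a 1500-hour deuterium lamp and eight-hour working days, as stated
# in the text.

LAMP_LIFE_HOURS = 1500
HOURS_PER_DAY = 8


def lamp_life_weeks(days_used_per_week):
    """Expected lamp life in weeks for a given usage pattern."""
    working_days = LAMP_LIFE_HOURS / HOURS_PER_DAY  # ~187.5 working days
    return working_days / days_used_per_week


for days in (2, 5, 7):
    weeks = lamp_life_weeks(days)
    print(f"{days} days/week -> ~{weeks:.0f} weeks (~{weeks / 4.33:.0f} months)")
```

At five days a week the lamp lasts roughly 37 weeks (about nine months), so a six-monthly PM/RQ leaves a comfortable margin; at seven days a week it lasts roughly 27 weeks, which is why a three-monthly interval is suggested for heavy use.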

In addition, due to the relatively short life of the deuterium lamp, it is recommended to check the following each day the spectrophotometer is used, as this provides additional assurance of the correct functioning of the spectrophotometer:

  • Lamp intensity
  • Dark current
  • Calibration of the deuterium emission lines at 486 and 656.1 nm
  • Filter and shutter speed
  • Photometric noise
  • Spectral baseline flatness
  • Short term photometric noise

Modern instruments have these tests pre-configured, and they can be performed by selecting the appropriate function. If any of the tests fail, with the exception of the dark current and the filter and shutter speed tests, the deuterium lamp should be replaced. If either the dark current or the filter and shutter speed test fails, the spectrophotometer should be taken out of service, repaired and re-qualified. Establishing these procedures will minimise both the likelihood of an operating function failing and the risk of any failure not being detected.
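The decision rule above can be sketched as follows. The test names are illustrative labels; in practice the pass/fail results would come from the instrument's built-in self-test functions.

```python
# Sketch of the daily-check decision rule described above. Test names are
# illustrative; the pass/fail results would come from the instrument's
# built-in self-tests.

# Failures of these checks point to lamp deterioration.
LAMP_DEPENDENT_TESTS = {
    "lamp intensity",
    "deuterium line calibration",
    "photometric noise",
    "baseline flatness",
}
# Failures of these checks point to an instrument fault, not the lamp.
HARDWARE_TESTS = {"dark current", "filter and shutter speed"}


def daily_check_action(failed_tests):
    """Return the recommended action for a set of failed daily checks."""
    failed = set(failed_tests)
    if failed & HARDWARE_TESTS:
        return "take out of service, repair and re-qualify"
    if failed & LAMP_DEPENDENT_TESTS:
        return "replace deuterium lamp"
    return "no action"
```

Note that a hardware-test failure takes precedence: even if lamp-dependent checks also fail, a dark current or filter and shutter speed failure means the instrument itself needs repair and re-qualification.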

The Risk Factors for the quality and data integrity functions are already low without any mitigation. Therefore the performance of these functions need only be checked during the OQ and PQ to confirm correct configuration, after which any failure would be readily detectable. However, staff should be given appropriate training or instructions so that they can recognise a failure and take the appropriate actions.

Conclusion

Failure Mode and Effects Analysis (FMEA) is an easy-to-use risk assessment tool that can be readily applied to assessing the quality, compliance and business risks associated with functional failure of laboratory instrumentation. Performing such risk assessments allows well-informed decisions to be made about establishing appropriate controls and procedures to economically manage the risks associated with the failure of critical instrument functions.

References

  • [1] OECD Principles of Good Laboratory Practice, Organisation for Economic Co-operation and Development, 1997.
  • [2] Guide to Good Manufacturing Practice for Medicinal Products, Geneva: PIC/S, 2009.
  • [3] ISO/IEC 17025 General Requirements for the Competence of Testing and Calibration Laboratories, Geneva: International Organization for Standardization, 2005.
  • [4] “Failure Mode and Effects Analysis,” Wikipedia. [Online]. Available: http://en.wikipedia.org/wiki/Failure_mode_and_effects_analysis. [Accessed 16 Feb 2014].
  • [5] D. Trew, An Easy to Understand Guide to HPLC Validation, Premier Validation, 2013.