Out of Specification Investigation Phase II & III (MHRA)

Out of Specification Investigation Phase II (Unknown Cause / No Assignable Cause) & Phase III

Phase II Investigation – Unknown Cause / No Assignable Cause

These investigations are difficult to perform because the result may not be available until one to two weeks after the analysis was performed, and possibly weeks after the batch was manufactured.

It is important to evaluate the test conditions carefully and determine the boundary of the samples/products/manufacturing areas involved. If you do not determine the boundary of the suspect results, it is difficult to determine whether one or more batches are impacted.

The laboratory and manufacturing investigations need to be in depth. The investigations should clearly state the hypothesis and who will be responsible for the identified tasks.

Are the organisms of an expected type? Determine the likely source – would the organism be likely to be found where it was recovered?

Review the media – prepared in house or bought in pre-prepared, supplier history, sterilization history

Equipment/utilities used – validation, maintenance, and cleaning status. Evaluate area/environmental trends for test area and support areas.

Cleaning and maintenance of the test environment

Disinfectant used

Use appropriate root cause analysis tools to help brainstorm all possibilities. It is likely that there may be more than one root cause.

Review decisions and actions taken in light of any new information.

Due to the variability of microbiological results, do not limit the investigation to the specific batch; it should be broader, reviewing historical results and trends.

Unusual events should be included to understand potential impacts. What is the justification for performing a repeat analysis (is any sample left), a re-test or a resample?

Any identifications may need to be at DNA/RNA level (bioburden failures)

All potential sources of contamination need to be considered – process-flow the issue from sample storage through to the test environment.

Use scientific decisions/justifications and risk based analysis.

The investigation may include working closely with the manufacturing team

During the investigation, it is an advantage to go and look at where the contamination occurred.

Asking how the relevant plant is cleaned, tested for integrity, checked for wear, checked for material suitability and maintained at the occurrence site may reveal possible causes.

Where possible talk directly to the staff involved as some information may be missed if not looked at from the chemist/ microbiologist point of view.

Look for other documentation such as deviations and engineering notifications around the area of concern (this is applicable to the laboratory as well as manufacturing).

Trending can reveal species drift, which may also be worthy of an action-limit-style investigation.
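One simple way to look for species drift in trending data is to compare the proportion of each recovered organism in a recent window against a baseline window. The sketch below is illustrative only; the organism names, counts and windows are assumptions, and any formal drift criterion would need to be defined and justified locally.

```python
# Minimal sketch: compare the distribution of identified organisms between a
# baseline period and a recent period to highlight possible species drift.
# Organism names and counts are illustrative assumptions.

from collections import Counter

baseline = ["Staphylococcus", "Micrococcus", "Micrococcus", "Bacillus",
            "Staphylococcus", "Micrococcus"]
recent = ["Bacillus", "Bacillus", "Staphylococcus", "Bacillus", "Bacillus"]

def proportions(ids):
    counts = Counter(ids)
    total = sum(counts.values())
    return {org: n / total for org, n in counts.items()}

base_p, recent_p = proportions(baseline), proportions(recent)

for organism in sorted(set(baseline) | set(recent)):
    shift = recent_p.get(organism, 0.0) - base_p.get(organism, 0.0)
    print(f"{organism}: baseline {base_p.get(organism, 0.0):.0%}, "
          f"recent {recent_p.get(organism, 0.0):.0%}, shift {shift:+.0%}")
```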

Statistical analysis of microbiological data can include a large number of zero results, so recovery rates (frequency of detection) or similar measures may have to be used.
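Where count data contain many zeros, a recovery rate (frequency of detection) can be a more informative summary than a simple mean count. The sketch below is a minimal illustration; the data and the choice of summary statistics are assumptions, not a prescribed method.

```python
# Minimal sketch: summarising microbiological monitoring data that contain
# many zero counts. Instead of a mean count alone, a recovery rate
# (frequency of detection) is often more informative. Data are illustrative.

from statistics import mean

counts_cfu = [0, 0, 1, 0, 0, 0, 3, 0, 0, 0, 0, 2, 0, 0, 0]  # CFU per sample

n_samples = len(counts_cfu)
n_recoveries = sum(1 for c in counts_cfu if c > 0)

recovery_rate = n_recoveries / n_samples            # fraction of samples with any growth
mean_of_positives = mean([c for c in counts_cfu if c > 0]) if n_recoveries else 0.0

print(f"Samples: {n_samples}")
print(f"Recovery rate: {recovery_rate:.1%}")
print(f"Mean count when growth is recovered: {mean_of_positives:.1f} CFU")
```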

If a sample is invalidated, the remaining level of assurance needs to be carefully considered: is there sufficient residual information? Corrective actions may be appropriate for more than one root cause.

Stability – OOS/OOT:

Stability OOS/OOT situations should be escalated as soon as the suspect result is found. Follow the investigation as above for Phase I and Phase II. For OOS situations, regulatory agencies will require notification within a short time of discovery due to the recall potential.

If abnormal results are found at any stability interval that predict the test results may be OOS before the next testing interval, schedule additional testing before the next scheduled interval. This will help better determine the appropriate actions to be taken.
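As an illustration of how such a prediction might be made objectively, the following sketch fits a simple linear trend to the completed stability time points and extrapolates to the next scheduled interval. The time points, assay values and specification limit are assumed for illustration, and a linear model is only one possible choice.

```python
# Minimal sketch: fit a simple linear trend to stability assay results and
# check whether the extrapolated value at the next scheduled interval would
# fall below the lower specification limit. All numbers are illustrative.

import numpy as np

months = np.array([0, 3, 6, 9, 12], dtype=float)   # completed time points
assay = np.array([100.1, 99.2, 98.0, 96.8, 95.9])  # % label claim
lower_spec = 95.0                                   # assumed specification limit
next_interval = 18.0                                # next scheduled time point (months)

slope, intercept = np.polyfit(months, assay, 1)     # ordinary least squares fit
predicted = slope * next_interval + intercept

print(f"Slope: {slope:.3f} %/month, predicted at {next_interval:.0f} months: {predicted:.1f}%")
if predicted < lower_spec:
    print("Predicted result below specification: schedule additional testing "
          "before the next scheduled interval.")
```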

The stability OOS should link to the Product Recall procedures.

OOT

To facilitate the prompt identification of potential issues, and to ensure data quality, it is advantageous to use objective (often statistical) methods that detect potential out-of-trend (OOT) stability data quickly.

OOT alerts can be classified into three categories to help identify the appropriate depth for an investigation. OOT stability alerts can be referred to as:

– analytical,

– process control, and

– compliance alerts.

As the alert level increases from analytical to process control to compliance alert, the depth of investigation should increase.

Stability:

A compliance alert defines a case in which an OOT result suggests the potential or likelihood for OOS results to occur before the expiration date within the same stability study (or for other studies) on the same product.

Historical data are needed to identify OOT alerts.

An analytical alert is observed when a single result is aberrant but within specification limits (i.e., outside normal analytical or sampling variation and normal change over time).
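One possible objective way to flag an analytical alert is to compare a new result against the band of normal variation estimated from historical results at the same time point. The sketch below assumes a simple mean ± 3 standard deviation band and illustrative numbers; the actual criterion should be statistically justified for the product concerned.

```python
# Minimal sketch: flag an "analytical alert" when a single stability result
# falls outside the band of normal variation estimated from historical data
# at the same time point, while still being within specification.
# The +/- 3*SD criterion and all numbers are illustrative assumptions.

from statistics import mean, stdev

historical_12m = [98.2, 97.9, 98.5, 98.1, 97.7, 98.3]  # % assay, prior batches at 12 months
new_result = 96.4                                       # current batch at 12 months
lower_spec, upper_spec = 95.0, 105.0                    # assumed specification

centre = mean(historical_12m)
spread = stdev(historical_12m)
low, high = centre - 3 * spread, centre + 3 * spread

within_spec = lower_spec <= new_result <= upper_spec
within_normal = low <= new_result <= high

if within_spec and not within_normal:
    print(f"Analytical alert: {new_result} is within specification but outside "
          f"the historical band {low:.1f}-{high:.1f}.")
```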

If the batch is rejected there still needs to be an investigation to determine:

– whether other batches or products are affected;

– the corrective and preventive actions to be identified and implemented.

Phase III Investigation

The Phase III investigation should review the completed manufacturing investigation and the combined laboratory investigation into the suspect analytical results, and/or the method validation, for possible causes of the results obtained.

To conclude the investigation all of the results must be evaluated.

The investigation report should contain a summary of the investigations performed; and a detailed conclusion.

For microbiological investigations, where appropriate, use risk analysis tools to support the decisions taken and conclusions drawn. It may not have been possible to determine the actual root cause; therefore, a robust most probable root cause may have to be given.

The batch quality must be determined and a disposition decision taken.

Once a batch has been rejected there is no limit to further testing to determine the cause of failure, so that corrective action can be taken.

The decision to reject cannot be reversed as a result of further testing.

The impact of OOS result on other batches, ongoing stability studies, validated processes and testing procedures should be determined by Quality Control and Quality Assurance and be documented in the conclusion, along with appropriate corrective and preventive actions.

Batch Disposition

Conclusion:

If no laboratory or calculation errors are identified in Phase I and Phase II there is no scientific basis for invalidating initial OOS results in favor of passing retest results. All test results, both passing and suspect, should be reported (in all QC documents and any Certificates of Analysis) and all data has to be considered in batch release decisions.

If the investigation determines that the initial sampling method was inherently inadequate, a new accurate sampling method must be developed, documented, and reviewed and approved by the Quality Assurance responsible for the release.  Consideration should be given to other lots sampled by the same method.

An initial OOS result does not necessarily mean the subject batch fails and must be rejected. The OOS result should be investigated, and the findings of the investigation, including retest results, should be interpreted to evaluate the batch and reach a decision regarding release or rejection which should be fully documented.

In those cases where the investigation indicates an OOS result is caused by a factor affecting the batch quality (i.e., an OOS result is confirmed), the result should be used in evaluating the quality of the batch or lot. A confirmed OOS result indicates that the batch does not meet established standards or specifications and should result in the batch’s rejection and proper disposition. Other lots should be reviewed to assess impact.

For inconclusive investigations — in cases where an investigation:-

   (1) does not reveal a cause for the OOS test result and

   (2) does not confirm the OOS result

the OOS result should be given full consideration (most probable cause determined) in the batch or lot disposition decision by the certifying QP, and the potential for a batch-specific variation also needs to be considered.

Any decision to release a batch, in spite of an initial OOS result that has not been invalidated, should come only after a full investigation has shown that the OOS result does not reflect the quality of the batch. In making such a decision, Quality Assurance/the QP should always err on the side of caution.

Reference :- MHRA OOS/OOT PPT

Out Of Specification Investigation Phase Ia & Phase Ib (MHRA)

Out Of Specification Investigation Phase Ia & Phase Ib (MHRA)

Phase Ia Investigation

Definition:

The Phase Ia investigation is to determine whether there has been a clear, obvious error, either due to external circumstances (such as a power failure) or one that the analyst detected prior to generating data (such as spilling the sample), which will negate the requirement for a Phase Ib investigation.

For microbiological analysis this may be after the analysis has been completed and reviewed during reading of the samples.

It is expected that these issues are trended even if a laboratory investigation (Phase Ib or II) was not raised.

Phase Ia Investigation – Obvious Error

Examples

Calculation error – 

analyst and supervisor to review; both initial and date the correction.

Power outage –

analyst and supervisor document the event, annotate “power failure; analysis to be repeated” on all associated analytical documentation.

Equipment failure  –

analyst and supervisor document the event, annotate “equipment failure; analysis to be repeated” and cross-reference the maintenance record.

Testing errors –

for example, spilling of the sample solution, incomplete transfer of a sample; the analyst must document immediately.

for microbiology it could be growth on a plate not in the test sample area, negative or positive controls failing.

Incorrect Instrument Parameters –

for example, setting the detector at the wrong wavelength; analyst and supervisor document the event and annotate “incorrect instrument parameter; analysis to be repeated” on all associated analytical documentation.

If no error was noted and none of the above conditions were met, a Phase Ib investigation must take place.

Phase Ib Investigation

Specification –

A specification is defined as a list of tests, references to analytical procedures, and appropriate acceptance criteria which are numerical limits, ranges, or other criteria for the tests described.  It establishes the set of criteria to which a drug substance, drug product or materials at other stages of its manufacture should conform to be considered acceptable for its intended use.  “Conformance to specification” means that the drug substance and drug product, when tested according to the listed analytical procedures, will meet the acceptance criteria. Specifications are critical quality standards that are proposed and justified by the manufacturer and approved by regulatory authorities as conditions of approval.

Regulatory Approved Specification

Specifications for release testing.  If no release specifications have been established then the internal specification becomes the release specification.

Acceptance Criteria –

Numerical limits, ranges, or other suitable measures for acceptance of the results of analytical procedures which the drug substance or drug product or materials at other stages of their manufacture should meet.

Internal Specification –

These are also action limits within the regulatory specifications.

Phase Ib Investigation – Definitions

Assignable Cause –

An identified reason for obtaining an OOS or aberrant/anomalous result.

No Assignable Cause  –

  When no reason could be identified.

Invalidated test –

A test is considered invalid when the investigation has determined the assignable cause.

Reportable result –

Is the final analytical result. This result is appropriately defined in the written approved test method and derived from one full execution of that method, starting from the original sample.

Warning Level or Trend excursions

If two or more consecutive samples exceed the warning (alert) level, or if an increasing level of counts or the same organism identified over a short period is seen, consideration should be given to treating the results as action-level excursions.
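A minimal sketch of such a check is shown below; the alert limit, data and the "two consecutive excursions / three increasing counts" rules are illustrative assumptions rather than defined requirements.

```python
# Minimal sketch: check environmental monitoring counts for (a) two or more
# consecutive results above the warning (alert) level and (b) a steadily
# increasing count over a short window. Limits and data are illustrative.

counts = [2, 5, 12, 14, 9, 18, 21]   # CFU per sample, in chronological order
alert_level = 10                      # assumed warning/alert limit

# (a) two or more consecutive results above the alert level
consecutive_alerts = any(
    counts[i] > alert_level and counts[i + 1] > alert_level
    for i in range(len(counts) - 1)
)

# (b) strictly increasing counts over the last three results
increasing_trend = counts[-3] < counts[-2] < counts[-1]

if consecutive_alerts or increasing_trend:
    print("Consider treating these results as action-level excursions.")
```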

Hypothesis/Investigative Testing

Testing performed to help confirm or discount a possible root cause, i.e. what might have happened that can be tested. For example, it may include further testing regarding sample filtration, sonication/extraction, and potential equipment failures. Multiple hypotheses can be explored.

Investigation by Analyst and Supervisor

Phase Ib Investigation – Initial Investigation conducted by the analyst and supervisor using the Laboratory Investigation Checklist

Contact Production/Contract Giver/QP/MAH as appropriate

For microbiological analysis where possible once a suspect result has been identified ensure all items related to the test failure are retained such as other environmental plates, dilutions, ampoules/vials of product, temperature data, auto-pipettes, reagents – growth media.  No implicated test environmental plates should be destroyed until the investigation has been completed.

The Analyst and Supervisor investigation should be restricted to data / equipment / analysis review only

On completion of the analyst and supervisor investigation, re-measurement can start once the hypothesis plan is documented; it is only to support the investigation testing.

This initial hypothesis testing can include the original working stock solutions but should not include another preparation from the original sample (see: re-testing)

The checklist may not be all-inclusive, but should be a good guideline to cover the pertinent areas that need to be covered in any laboratory investigation:-

- Correct test methodology followed, e.g. version number.
- Correct sample(s) taken/tested (check labels; was the sample taken from the correct place?).
- Sample integrity maintained, correct container and chain of custody (was there an unusual event or problem?).
- How were sample containers stored prior to use?
- Correct sampling procedure followed, e.g. version number.
- Assessment of the possibility that sample contamination occurred during the testing/re-testing procedure (e.g. sample left open to the air or unattended).
- All equipment used in the testing is within calibration date.
- Review equipment log books.
- Appropriate standards used in the analysis.
- Standard(s) and/or control(s) performed as expected.
- System suitability conditions met (those before analysis and during analysis).
- Correct and clean glassware used.
- Correct pipette/volumetric flask volumes used.
- Correct specification applied.
- Media/reagents prepared according to procedure.
- Items were within expiry date.
- A visual examination (solid and solution) reveals normal or abnormal appearance.
- Data acceptance criteria met.

- The analyst is trained on the method; interview the analyst to assess knowledge of the correct procedure.
- Examination of the raw data, including chromatograms and spectra; any anomalous or suspect peaks or data.
- Any previous issues with this assay.
- Other potentially interfering testing/activities occurring at the time of the test.
- Any issues with environmental temperature/humidity within the area whilst the test was conducted.
- Review of other data for other batches performed within the same analysis set.
- Consideration of any other OOS results obtained on the batch of material under test.
- Assessment of method validation.

Additional considerations for microbiological analysis:

- Are the isolates located as expected (on glove dab marks, SAS ‘dimples’, filter membrane, etc.)?
- Was the sample media integral (i.e. no cracks in plates)?
- Was there contamination present in other tests (or related tests) performed at the same time, including environmental controls?
- Were negative and positive controls satisfactory?
- Were the correct media/reagents used?
- Were the samples integral (not leaking)?
- Were the samples stored correctly (e.g. refrigerated)?
- Were the samples held for the correct time before being used for the test?
- Were the media/reagents stored correctly before use?
- Were the incubation conditions satisfactory?
- Take photographs to document the samples at the time of reading (include plates, Gram stains and anything else that may be relevant).

If the investigation is still not closed, then it should proceed to Phase II.

Reference:- MHRA OOS & OOT PPT.

Out of Specification & Out of Trend Investigations (MHRA)

Flow chart of OOS & OOT Investigations

Laboratory Analysis

Investigations of “Out of Specification (OOS) / Out of Trend (OOT)/ Atypical results” have to be done in cases of:

  • Batch release testing and testing of starting materials.
  • In-Process Control testing: if the data are used for batch calculations/decisions and are included in a dossier and on Certificates of Analysis.
  • Stability studies on marketed batches of finished products and/or active pharmaceutical ingredients, including on-going/follow-up stability (not stress tests).
  • A previously released batch used as a reference sample in an OOS investigation showing OOS or suspect results.
  • Batches for clinical trials.

All solutions and reagents should be retained until all data has been second person verified as being within the defined acceptance criteria.

Pharmacopoeias have specific criteria for additional analyses of specific tests (e.g. the dissolution specification with S1, S2 & S3 level testing; the uniformity of dosage units specification allowing testing of 20 additional units; sterility testing).

However, if samples usually comply at the first level of testing and a sample has to be tested to the next level, this should be investigated as it is not following the normal trend.
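For illustration of the staged-testing idea mentioned above, the sketch below encodes the commonly cited dissolution acceptance logic (individual units against Q + 5% at S1, averages and allowed low units at S2 and S3). The exact criteria must be taken from the applicable pharmacopoeial general chapter; the Q value and the results used here are assumptions.

```python
# Minimal sketch of staged dissolution acceptance logic (S1/S2/S3) for
# illustration only; the exact criteria must be confirmed against the
# applicable pharmacopoeial general chapter. Q and results are illustrative.

def dissolution_stage_result(results, q):
    """Return the stage at which the batch passes, or indicate further testing.

    results: cumulative list of % dissolved values (6, 12 or 24 units).
    q: the Q value (% dissolved) from the specification.
    """
    n = len(results)
    if n >= 6 and all(r >= q + 5 for r in results[:6]):
        return "S1 pass"
    if n >= 12:
        twelve = results[:12]
        if sum(twelve) / 12 >= q and all(r >= q - 15 for r in twelve):
            return "S2 pass"
    if n >= 24:
        if (sum(results) / 24 >= q
                and sum(1 for r in results if r < q - 15) <= 2
                and all(r >= q - 25 for r in results)):
            return "S3 pass"
    return "fail or proceed to next stage"

print(dissolution_stage_result([88, 91, 86, 90, 89, 92], q=80))  # illustrative S1 data
```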

The OOS process is not applicable to in-process testing performed while trying to achieve a manufacturing process end-point, i.e. adjustment of the manufacturing process (e.g. pH, viscosity), nor to studies conducted at variable parameters to check the impact of drift (e.g. process validation at variable parameters).

Out-of-Specification (OOS) Result –

  • A test result that does not comply with the pre-determined acceptance criteria (e.g. in filed applications, drug master files, approved marketing submissions, official compendia, or internal acceptance criteria).
  • Test results that fall outside acceptance criteria which have been established in official compendia and/or by company documentation (e.g. raw material specifications, in-process/final product testing, etc.).

Out of Trend (OOT) Result –

  • Generally a stability result that does not follow the expected trend, either in comparison with other stability batches or with respect to previous results collected during a stability study. However, the trends of starting materials and in-process samples may also yield out-of-trend data.
  • The result is not necessarily OOS but does not look like a typical data point.
  • Should also be considered for environmental trend analysis, such as for viable and non-viable data (action limit or warning limit trends).

Atypical / Aberrant / Anomalous Result –

  • Results that are still within specification but are unexpected, questionable, irregular, deviant or abnormal. Examples would be chromatograms that show unexpected peaks, unexpected results for a stability test point, etc.

Reference:- This is a guidance document that details MHRA expectations.

What is Data Integrity in the Pharmaceutical Industry?

MHRA says, “The way regulatory data is generated has continued to evolve in line with the ongoing development of supporting technologies such as the increasing use of electronic data capture, automation of systems and use of remote technologies; and the increased complexity of supply chains and ways of working, for example, via third party service providers. Systems to support these ways of working can range from manual processes with paper records to the use of fully computerized systems. The main purpose of the regulatory requirements remains the same, i.e. having confidence in the quality and the integrity of the data generated (to ensure patient safety and quality of products) and being able to reconstruct activities (Data Integrity).”

As per USFDA

What is “data integrity”?

For the purposes of this guidance, data integrity refers to the completeness, consistency, and accuracy of data. Complete, consistent, and accurate data should be attributable, legible, contemporaneously recorded, original or a true copy, and accurate (ALCOA).

As per MHRA

What is “data integrity”?

Data integrity is the degree to which data are complete, consistent, accurate, trustworthy, reliable and that these characteristics of the data are maintained throughout the data life cycle. The data should be collected and maintained in a secure manner, so that they are attributable, legible, contemporaneously recorded, original (or a true copy) and accurate. Assuring data integrity requires appropriate quality and risk management systems, including adherence to sound scientific principles and good documentation practices.

MHRA has defined some principles of data integrity, as given below:

  • The organisation needs to take responsibility for the systems used and the data they generate. The organisational culture should ensure data is complete, consistent and accurate in all its forms, i.e. paper and electronic.
  • Arrangements within an organisation with respect to people, systems and facilities should be designed, operated and, where appropriate, adapted to support a suitable working environment, i.e. creating the right environment to enable data integrity controls to be effective.
  • The impact of organisational culture, the behaviour driven by performance indicators, objectives and senior management behaviour on the success of data governance measures should not be underestimated. The data governance policy (or equivalent) should be endorsed at the highest levels of the organisation.
  • Organisations are expected to implement, design and operate a documented system that provides an acceptable state of control based on the data integrity risk, with supporting rationale. An example of a suitable approach is to perform a data integrity risk assessment (DIRA) where the processes that produce data or where data is obtained are mapped out, each of the formats and their controls are identified, and the data criticality and inherent risks are documented.
  • Organisations are not expected to implement a forensic approach to data checking on a routine basis. Systems should maintain appropriate levels of control, whilst wider data governance measures should ensure that periodic audits can detect opportunities for data integrity failures within the organisation’s systems.
  • The effort and resource applied to assure the integrity of the data should be commensurate with the risk and impact of a data integrity failure to the patient or environment. Collectively these arrangements fulfil the concept of data governance.
  • Organisations should be aware that reverting from automated or computerised systems to paper-based manual systems, or vice versa, will not in itself remove the need for appropriate data integrity controls.
  • Where data integrity weaknesses are identified, companies should ensure that appropriate corrective and preventive actions are implemented across all relevant activities and systems and not in isolation.
  • Appropriate notification to regulatory authorities should be made where significant data integrity incidents have been identified.
  • The guidance refers to the acronym ALCOA rather than ‘ALCOA+’, ALCOA being Attributable, Legible, Contemporaneous, Original and Accurate, and the ‘+’ referring to Complete, Consistent, Enduring and Available. ALCOA was historically regarded as defining the attributes of data quality that are suitable for regulatory purposes. The ‘+’ has been subsequently added to emphasise the requirements. There is no difference in expectations regardless of which acronym is used, since data governance measures should ensure that data is complete, consistent, enduring and available throughout the data lifecycle.

DEFINITIONS:

ALCOA: Attributable, Legible, Contemporaneously Recorded, Original & Accurate.

Attributable: This should include who performed an action and when.

Legible: All data recorded must be legible (readable) and permanent.

Contemporaneously: Contemporaneous means to record the result, measurement or data at the time the work is performed.

Original: Original data, sometimes referred to as source data or primary data, is the medium in which the data point is recorded for the first time.

Accurate: For data and records to be accurate, they should be free from errors, complete, truthful and reflective of the observation.

Raw Data: Original records and documentation, retained in the format in which they were originally generated (i.e. paper or electronic) or as a true copy.

Data Life Cycle: All phases in the life of the data (including raw data) from initial generation and recording through processing (including transformation or migration), use, data retention, archive/retrieval and destruction.

Original Record: Data as the file or format in which it was originally generated, preserving the integrity (accuracy, completeness, content and meaning) of the record, e.g. original paper record of  manual observation, or electronic raw data file from a computerized system.

Audit Trail: The audit trail is an integral requirement of an electronic record, ensuring the validity and integrity of the record and the link between any electronic signature and the record associated with it.

Metadata: A set of data that describes and gives information about other data. It provides information about a certain item’s content.

PROCEDURE (SOP):

  • All departments shall be verified during self-inspection.
  • Apart from self-inspection, on-line data integrity verification shall be carried out for the manufacturing process and laboratory analysis. This verification shall be documented in the respective Annexures.
  • All Department Heads or their designees who are authorized to conduct self-inspection/internal audits shall participate in the Data Integrity Verification Process.
  • Data Integrity Verification Schedule is as per
  • Members of the Data Integrity Verification Team shall verify or ensure the following, but not limited to:
  • Documents should be readily available for review at the operational place with all supporting and necessary documents.
  • There are no offline documentation practices, e.g. pick up any batch record, analytical data or other document and verify that all information is recorded contemporaneously.
  • There is no practice of advance dating of documents, e.g. pick up any batch record, analytical data or other document and ensure that there is no information recorded for activities which have not yet been executed.
  • Information and data recorded is permanent and legible.
  • There is no sign of data tampering and altering without proper authorizations. e.g. verify document and ensure that there is no data altered by erasing previous entries.
  • The data is properly supplemented with additional information (Metadata).
  • Signature on records is matching with relevant specimen signature. e.g. Collect few records and check the signature on record versus specimen signature.
  • There is no use of scrap paper for recording of official information before recording data on official records. e.g. Check the work place for any scrap paper. If observed, pick up them and ensure that there is no official information recorded on scrap paper before copying it to official records.
  • There is no overlapping of date & time when multiple tasks are handled by one person, e.g. take multiple documents executed by one person and verify whether multiple tasks and entries were performed by that person at one particular point of time on the same day (a simple timestamp-overlap check is sketched after this list).
  • If multiple tasks were performed by one person on the same day with overlapping times, check whether execution of such multiple tasks by one person is practically possible.
  • The attendance date & time of the employee match the date & time of the documents updated by him/her, e.g. collect executed documents and verify the entries (date & time) made by the person against his/her attendance records.
  • Microbiology test specimens/plates/tubes are not discarded without recording results, e.g. collect microbiology testing log books and cross-verify with the relevant incubator; ensure that all microbiology test specimens/plates/tubes are available as per the records.
  • A password protection SOP should be available in the department, and the User ID should be unique for each analyst.
  • Data backups should be taken periodically and kept secure, and it should not be possible to delete data.
  • There is no mismatch between saved data and printed data, e.g. select a data logging/storage system such as a building management system, data logger, instrument or equipment; access the stored/saved data from the memory of the instrument and cross-check against printed/signed copies of the same data.
  • Data Falsification and Data Fabrication. Ensure that there are no such practices.
  • After verification, record the observation (if any) on data integrity checklist.
  • All observations shall be summarized by the Head of Quality Assurance or his designee.
  • Head QA/designee shall share the observations through the checklist with the relevant department head.
  • The relevant department shall initiate the investigation followed by impact assessment and corrective and preventive action. All such discrepancies shall be addressed through the data integrity checklist Annexure in the Auditee Response section.
  • Quality Assurance shall track implementation of the corrective and preventive actions, and shall also monitor the corrective and preventive actions for effectiveness (as per the relevant site-approved procedure).
  • The maximum timeline for closure shall be 90 days.
  • Data integrity verification shall be carried out during on-line operation of production. Verification shall be carried out and documented in the Annexure.
  • Data integrity verification shall be carried out for the conducted analysis of finished products and raw materials. Verification shall be carried out and documented in the Annexure.
  • Verification shall be done for each and every batch of raw material and finished product analysis.
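As referenced in the list above, the following sketch illustrates one simple way to flag potentially overlapping activity timestamps recorded by the same person; the record format, names and times are assumptions for illustration only.

```python
# Minimal sketch: flag potentially overlapping activities recorded by the same
# person on the same day, as a data integrity verification aid. The record
# format (person, activity, start, end) is an assumption for illustration.

from datetime import datetime

FMT = "%Y-%m-%d %H:%M"

records = [
    ("analyst_a", "HPLC assay batch 101",    "2024-05-02 09:00", "2024-05-02 11:30"),
    ("analyst_a", "Balance check",            "2024-05-02 10:45", "2024-05-02 11:00"),
    ("analyst_b", "pH measurement batch 102", "2024-05-02 09:15", "2024-05-02 09:40"),
]

def overlaps(a_start, a_end, b_start, b_end):
    # Two intervals overlap if each starts before the other ends.
    return a_start < b_end and b_start < a_end

parsed = [
    (person, task, datetime.strptime(start, FMT), datetime.strptime(end, FMT))
    for person, task, start, end in records
]

for i in range(len(parsed)):
    for j in range(i + 1, len(parsed)):
        p1, t1, s1, e1 = parsed[i]
        p2, t2, s2, e2 = parsed[j]
        if p1 == p2 and overlaps(s1, e1, s2, e2):
            print(f"Possible overlap for {p1}: '{t1}' and '{t2}' - review whether "
                  f"simultaneous execution is practically possible.")
```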

UNSCHEDULED VERIFICATION:

  • The relevant department shall submit the checklist (with all supporting documents and target completion dates wherever required) to Quality Assurance.
  • Head QA/designee shall review the response and all supporting documents. If there are any disagreements, the same shall be communicated to the relevant department and a re-inspection shall be planned.
  • If any person notices a data integrity failure, the observer shall inform Quality Assurance by telephone or e-mail.

References

Computerised systems. In: The rules governing medicinal products in the European Union. Volume 4: Good manufacturing practice (GMP) guidelines: Annex 11. Brussels: European Commission.
OECD series on principles of good laboratory practice (GLP) and compliance monitoring. Paris: Organisation for Economic Co-operation and Development.
Good Clinical Practice (GCP) ICH E6(R2) November 2016
Guidance on good data and record management practices; World Health Organisation, WHO Technical Report Series, No.996, Annex 5; 2016.
Good Practices For Data Management And Integrity In Regulated GMP/GDP Environments – PIC/S; PI041-1(draft 2); August 2016.
MHRA GMP data integrity definitions and guidance for industry. London: Medicines and Healthcare Products Regulatory Agency; March 2015.
MHRA/HRA DRAFT Guidance on the use of electronic consent.
The Human Medicines Regulations 2012 (Statutory Instrument 2012 No. 1916)
EU Good Pharmacovigilance Practice Modules