Regulatory Expectations When Using Third Party AI Products in GMP

Published on 05/12/2025

The increasing integration of Artificial Intelligence (AI) into Good Manufacturing Practice (GMP) environments presents both opportunities and challenges for regulatory professionals. As organizations implement AI and machine learning (ML) platforms from external vendors, understanding the regulatory landscape is essential. This article offers a structured overview of relevant regulations, guidelines, and agency expectations concerning vendor qualification audits for AI/ML products in GMP settings.

Context

Regulatory Affairs (RA) serves as a vital bridge between companies and regulatory authorities. In the context of AI in Quality Systems, particularly for GxP (Good Practice) environments, RA professionals must ensure that AI systems meet applicable regulations while also supporting quality and efficacy in manufacturing processes. The rise in cloud AI utilization has introduced new dimensions of risk and complexity that must be navigated with a robust vendor qualification process.

Legal/Regulatory Basis

The regulatory basis for using AI in GMP is derived from several key frameworks and guidelines set forth by regulatory authorities, including the FDA (Food and Drug Administration), EMA (European Medicines Agency), and MHRA (Medicines and Healthcare products Regulatory Agency).

  • 21 CFR Part 820: This regulation outlines the quality system requirements for medical device manufacturing, emphasizing the need for effective supplier management, including oversight of third-party AI tools.
  • EU GMP Guidelines (EudraLex Volume 4): These guidelines detail expectations for ensuring the quality and safety of medicinal products and highlight the importance of vendor validation in line with quality assurance principles.
  • ICH Q7: This guideline pertains to Good Manufacturing Practice for Active Pharmaceutical Ingredients, underscoring the need for proper material management and compliance when involving third-party vendors, including those providing AI solutions.
  • FDA Guidance on Software as a Medical Device (SaMD): This document discusses regulatory considerations specific to software used in a medical capacity, relevant for AI tools used to inform quality control decisions.
Documentation

    Effective documentation is a cornerstone of compliance in GMP environments when utilizing third-party AI products. Here are critical components that must be documented:

    Vendor Qualification Documents

    Documentation must include comprehensive vendor qualification records that outline the rationale for selecting a specific AI vendor. Necessary elements include:

    • Vendor profile: Detailed information about the vendor, including their experience, product offerings, and past audit results.
    • Risk assessment: Identify potential risks associated with the AI technology and vendor operations that may impact product quality.
    • Technical specifications: Capture the functionality of the AI product, including architecture and algorithm behavior.
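The qualification elements listed above can be held in a structured record so completeness is easy to verify. The sketch below is illustrative only; the class and field names are assumptions, not terms prescribed by any regulation:

```python
from dataclasses import dataclass, field


@dataclass
class VendorQualificationRecord:
    """Illustrative vendor qualification file (field names are assumptions)."""
    vendor_name: str
    experience_summary: str                                  # vendor profile
    past_audit_results: list[str] = field(default_factory=list)
    identified_risks: list[str] = field(default_factory=list)   # risk assessment
    technical_specs: dict[str, str] = field(default_factory=dict)

    def is_complete(self) -> bool:
        # The record is complete only when profile, risk assessment,
        # and technical specifications are all populated.
        return bool(self.experience_summary
                    and self.identified_risks
                    and self.technical_specs)
```

A QA reviewer could call `is_complete()` as a first-pass gate before the record enters the review and approval flow.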

    Auditing Records

    Conducting a thorough audit is essential in qualifying a vendor. The following records should be maintained:

    • Audit plan and checklist: Develop a targeted audit plan focused on AI capabilities, including data integrity, algorithm transparency, and operational controls.
    • Audit findings: Document all observations, deviations, and required corrective actions during the audit process.
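A targeted audit plan can be expressed as a checklist keyed by the focus areas above, with a small helper to surface open findings. The checklist items here are illustrative assumptions, not a prescribed audit standard:

```python
# Illustrative checklist keyed by the AI-specific focus areas
# (data integrity, algorithm transparency, operational controls).
AUDIT_CHECKLIST = {
    "data_integrity": ["access controls reviewed", "audit trail enabled"],
    "algorithm_transparency": ["model version documented", "training data lineage available"],
    "operational_controls": ["incident response procedure", "change notification terms"],
}


def open_findings(results: dict[str, bool]) -> list[str]:
    """Return the checklist items that were not satisfied during the audit."""
    return [item for item, passed in results.items() if not passed]
```

Each unsatisfied item would then be documented as an observation with a required corrective action.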

    Change Control Documents

    When changes occur, a robust change control process needs to be followed:

    • Change request forms: Maintain comprehensive forms detailing proposed changes to the AI systems or services provided by vendors.
    • Impact assessments: Evaluate how the changes may impact existing processes and compliance.
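The change control gate described above can be made explicit in a simple record: a change may proceed only once its impact assessment and approval are both complete. The field names below are illustrative assumptions:

```python
from dataclasses import dataclass, field


@dataclass
class ChangeRequest:
    """Illustrative change-request record for a vendor AI system."""
    description: str
    affected_processes: list[str] = field(default_factory=list)
    impact_assessed: bool = False
    approved: bool = False


def may_implement(cr: ChangeRequest) -> bool:
    # A change proceeds only after both the impact assessment
    # and the approval step have been completed.
    return cr.impact_assessed and cr.approved
```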

    Review/Approval Flow

    The review and approval flow for AI products involves various steps, all of which must be well-defined and communicated across the organization:

    Initial Vendor Evaluation

    The evaluation process should start with an initial screening based on the vendor’s credentials, compliance with regulatory requirements, and past performance.

    Risk-Based Approach

    Utilizing a risk-based approach to assess the vendor’s AI solutions is critical. Consider the following:

    • Conduct a risk assessment aligned with the defined grading scale to evaluate potential impact and likelihood of failure.
    • Prioritize higher-risk vendors for a more in-depth evaluation and regular audits.
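A grading scale of the kind described above is often a simple impact-by-likelihood grid. The sketch below assumes a 1 to 5 scale and illustrative thresholds; an organization would define its own scale and cut-offs:

```python
def risk_priority(impact: int, likelihood: int) -> str:
    """Classify a vendor risk on an assumed 1-5 impact x likelihood grid.

    The scale and thresholds are illustrative, not regulatory values.
    """
    if not (1 <= impact <= 5 and 1 <= likelihood <= 5):
        raise ValueError("impact and likelihood must be on the 1-5 scale")
    score = impact * likelihood
    if score >= 15:
        return "high"    # prioritize for in-depth evaluation and regular audits
    if score >= 6:
        return "medium"
    return "low"
```

Vendors graded "high" would be prioritized for the more in-depth evaluation and regular audits mentioned above.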

    Audit Approval

    Once the audits are complete, the findings must be reported to the responsible QA or Regulatory Officer. Approval for vendor use should follow:

    • Compile a report summarizing the audit findings.
    • Present the findings in a management review board meeting for final approval.

    Common Deficiencies

    Identifying common deficiencies in vendor qualification audits can help mitigate risks and improve compliance:

    Inadequate Documentation

    One of the most prevalent issues is the lack of adequate documentation to support vendor qualification. Ensure all documentation referenced earlier is completed thoroughly and accurately.

    Poor Audit Practices

    It is critical that audits are conducted systematically. Common deficiencies include insufficient depth in the audit process and inadequate follow-up of corrective actions.

    Failure to Address Regulatory Changes

    Regulatory environments are evolving rapidly, especially concerning technologies like AI. Organizations must remain vigilant in reviewing and incorporating updated regulatory guidance.

    RA-Specific Decision Points

    Regulatory Affairs professionals must navigate complex decision points when engaging with AI vendors; these decisions can significantly impact compliance and operational effectiveness:

    Determining Filing Type

    RA professionals should establish when to file an application as a variation versus a new application:

    • Variation Filing: If the AI solution leads to minor changes in data processes or analytical methods that do not significantly impact product identity, strength, route of administration, or dosage form.
    • New Application: If the AI implements a fundamentally different approach or alters intended use, a new application should be considered.
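The variation-versus-new-application criteria above can be sketched as a first-pass decision helper. This is an illustrative triage aid only; the inputs and outcome labels are assumptions, and the actual filing decision always rests on regulatory judgment:

```python
def filing_type(alters_intended_use: bool, minor_data_process_change: bool) -> str:
    """Illustrative first-pass triage of variation vs. new application."""
    if alters_intended_use:
        # A fundamentally different approach or changed intended use
        # points toward a new application.
        return "new application"
    if minor_data_process_change:
        # Minor changes to data processes or analytical methods with no
        # significant impact on the product point toward a variation.
        return "variation"
    return "further assessment required"
```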

    Bridging Data Justification

    Justification for bridging data can be critical when utilizing AI solutions:

    • Demonstrate how existing data support or predict the performance characteristics relevant to the AI implementation.
    • Document the rationale for bridging data use comprehensively, including regulatory guidance references that support this approach.

    Conclusion

    As AI technology continues to evolve, regulatory professionals in the pharmaceutical and biotech industries must adapt to the associated complexities. Implementing third-party AI applications within GMP frameworks necessitates a robust vendor qualification audit process aimed at compliance and quality assurance. By understanding the regulatory expectations and maintaining high standards, organizations can successfully leverage AI technologies while meeting the legal obligations set by regulatory authorities.

    For further guidance, consult the FDA Guidance on Software as a Medical Device, ICH Q7, and the broader ICH quality guidelines.
