Published Date: 17/07/2025
The International Society for Pharmaceutical Engineering (ISPE) has released its Good Automated Manufacturing Practice (GAMP) Artificial Intelligence Guide. This guide is designed to be compatible with existing frameworks, including the International Organization for Standardization (ISO) frameworks related to AI and medical device software.
The document seeks to provide readers with best practices for using AI-enhanced computer systems to promote patient safety, product quality, and data integrity. It offers principles for designing and operating AI systems in a regulated environment.
The scope of the guide is broad: the authors wrote it to advise a wide range of companies and workers, including drug developers, suppliers, and regulators. It applies to various AI-enabled computer systems and medical devices, with discussions tailored to specific types of systems.
The main body of the guide provides key concepts about using GxP-regulated AI systems and discusses the life cycle from concept through retirement. Further chapters delve into quality risk management (QRM), the role of suppliers in supporting compliance, and governance frameworks. A set of appendices supports the main body, offering practical guidance on a number of topics.
The guide was developed with many GxP applications in mind while targeting a global readership with a diverse set of needs. It offers advice to help companies comply with legislation and the expectations of regulatory authorities worldwide, specifically naming the US Food and Drug Administration (FDA) and the UK Medicines and Healthcare products Regulatory Agency (MHRA), among others.
The guide also addresses the European AI Act, which was formally adopted by the European Council in May 2024 and is touted as “the first comprehensive regulation on AI by a major regulator anywhere.” The Act sorts AI applications into broad categories: prohibited unacceptable-risk applications, regulated high-risk applications, and lightly regulated, limited-risk general-use applications. The guide offers good practices that may help companies comply with the Act while also addressing even low-risk applications, in keeping with its priority on patient safety.
Brandi Stockton, founder and managing partner of the Triality Group, led the project, working with a team of researchers, developers, and other industry professionals, together encompassing a wide range of expertise. Among those experts is Taylor Chartier, CEO of Modicus Prime, who spoke with BioProcess Insider last year when the guide was under development.
“The guide was developed in response to the industry's need for practical, actionable guidance on the application of AI and machine learning (ML) in GxP-regulated areas,” Chartier told us at the time. “The primary aim of the guide is to ensure that AI technologies are implemented in a way that maintains patient safety, product quality, and data integrity, which are the chief concerns in the healthcare life sciences industry.”
The GAMP Artificial Intelligence Guide is the latest in the ISPE’s series of guidance documents, which the organization produces to give pharmaceutical manufacturing professionals practical advice on improving workflows and meeting regulatory standards. In June, the ISPE released Validation 4.0, and guides on many other topics are available on its website. Each guide has a different set of authors whose knowledge and experience are tailored to the topic at hand.
Q: What is the main purpose of the GAMP Artificial Intelligence Guide?
A: The main purpose of the GAMP Artificial Intelligence Guide is to provide best practices for using AI-enhanced computer systems to promote patient safety, product quality, and data integrity in the pharmaceutical industry.
Q: Who is the target audience for the GAMP Artificial Intelligence Guide?
A: The guide is designed to advise a wide range of companies and workers, including drug developers, suppliers, and regulators.
Q: What regulatory frameworks does the guide address?
A: The guide addresses legislation and regulatory expectations worldwide, specifically citing the US FDA, the UK MHRA, and the European AI Act.
Q: How does the guide categorize AI applications?
A: The guide categorizes AI applications into prohibited unacceptable-risk applications, regulated high-risk applications, and lightly regulated, limited-risk general-use applications.
Q: What is the role of suppliers in the guide?
A: The guide discusses the role of suppliers in supporting compliance and provides guidance on governance frameworks.