AI Adds Complexities to Real-World Data in FDA Drug Approvals

Sept. 17, 2024, 9:05 AM UTC

Physician notes, medical images, and lab test results derived from electronic health records are increasingly making their way into the FDA’s drug and biologic approval and post-approval processes—largely due to AI.

“There is a huge appetite for high-quality real-world data and real-world evidence to support product applications,” said Nicholaas Honig, head of regulatory and senior counsel at Aetion. “There’s becoming a lot more of it, and sponsors are really thinking through how this can support approvals in a number of different ways.”

Real-world data, as described by the Food and Drug Administration, is data relating to a patient’s health status or the delivery of health care that is routinely collected from a variety of sources. While it doesn’t replace randomized clinical trials—the FDA’s gold standard for determining whether a product is safe and effective—it’s considered valuable because it can provide insight into research questions that may be unanswerable through the traditional approach.

The FDA in July finalized guidance on harnessing EHRs and medical claims data in clinical studies to support drug and biologic regulatory decision making. Notably, the guidance acknowledged artificial intelligence when collecting and analyzing such data, as more drug sponsors and the companies they work with gravitate toward using advanced technology tools.

Life sciences attorneys expect AI to emerge as a double-edged sword: real-world data can be used to extract critical health information for studies supporting the approval or post-approval process, but its use can also draw added scrutiny to a product’s application when reviewed by the FDA.

“The FDA wants to make sure that AI isn’t just a black box that people are relying on,” said Sonia Nath, chair of Cooley LLP’s global life sciences and health-care regulatory practice group.

“There’s a lot of real-world data out there, and with AI becoming more prevalent in the health-care industry, there’s going to be even more access to it,” Nath said.

The agency is slated to release guidance by the end of the year on the use of AI and machine learning to support drug development, which some attorneys say will shed light on how the FDA is thinking about the technology’s use.

Not a ‘Black Box’

The FDA, as required by the 21st Century Cures Act, laid out in its July guidance various expectations for drug and biologic sponsors to ensure the integrity of the real-world data collected and the processes used to evaluate it.

The FDA also recognized how AI “may permit more rapid processing of unstructured electronic health care data.”

This is because EHRs are often unstructured when collected, as the source information comes from doctors’ notes and lab work. Medical claims data—information submitted to insurers to receive payment for treatments—can also be unorganized when collected. Drug sponsors typically work with companies or vendors who collect EHRs and medical claims data and then organize it to provide or sell to the sponsor.

The FDA in its guidance—while not endorsing a specific AI technology—said tools can include natural language processing, machine learning, and in particular deep learning to extract data elements from unstructured text in EHRs; develop computer algorithms that identify outcomes; or evaluate images or laboratory results.

“They’re very cautious about using RWD, but certainly they opened the door now,” said Xin Tao, head of the food and drug law practice at Baker & McKenzie LLP.

“With the use of AI, I think this will be very attractive to certain companies, especially expanding indications, labeling changes, and maybe sometimes narrowing indications to avoid adverse effects,” Tao said.

But any AI methods used in regulatory decisions before the FDA “require a significant amount of human-aided curation and decision-making,” the agency wrote in its guidance.

The FDA warned AI can inject “an additional level of data variability and quality considerations into the final study-specific dataset.”

“It’s a useful tool, but it’s just that,” Honig said. “It’s a tool that needs to be used with deliberate intentions and with an end purpose in mind.”

Life science attorneys say sponsors should consider the FDA’s caution that data captured within an EHR system or network may not represent comprehensive care. EHR data may also not accurately reflect the presence, characteristics, or severity of a particular disease, according to the FDA.

Those warnings, combined with the added complexity AI brings, should encourage sponsors to be careful when selecting data sources, attorneys say.

“The challenge is that both real world evidence and AI are things that FDA is just now having to develop policies and approaches on,” said Sarah Blankstein, counsel in Ropes & Gray LLP’s life sciences regulatory and compliance practice.

“Their understanding and approach is still evolving. So using AI tools and real-world data is going to add some additional complexity and uncertainty into an application,” Blankstein added.

Multiple Parties at Stake

Health attorneys are also eyeing how the FDA will oversee industry responsibilities when AI is used in regulatory decisions.

While the FDA’s guidance says it is intended to provide sponsors and other interested parties with considerations when proposing to use EHRs or medical claims data in clinical studies, drug sponsors are primarily responsible for bringing the application to the agency explaining the evidence.

“It’s not just a single party here. It’s multiple parties,” Tao said.

“How do you delegate responsibilities to make sure it’s the least burdensome for the industry to use this approach? Because if you just put this on the sponsors, they may not be in the best position to do this,” he said.

The FDA said if sponsors use AI or other derivation methods, the protocol should specify the assumptions and parameters of the computer algorithms used; the data sources from which the information was drawn to build the algorithm; whether the algorithm was supervised or unsupervised; and the metrics associated with validation of the methods.

“This guidance underscores that they really will need to demonstrate a lot of rigor and discipline in connection with their collection and curation and use of real-world data and real-world evidence,” said Jiayan Chen, partner at McDermott Will & Emery.

“AI holds a lot of promise for purposes of being able to derive insights from the data itself—the volume of information and the unstructured nature of many datasets are challenges that AI can help tackle,” Chen said.

“You still need to be very careful when you’re both selecting the data and now when you’re using AI tools to make sure that you are guarding against potential biases and harms.”

To contact the reporter on this story: Nyah Phengsitthy in Washington at nphengsitthy@bloombergindustry.com

To contact the editors responsible for this story: Zachary Sherwood at zsherwood@bloombergindustry.com; Brent Bierman at bbierman@bloomberglaw.com
