U.S. Food and Drug Administration main campus building.

The U.S. Food and Drug Administration (FDA) recently announced it would develop a new review framework for AI and machine learning-based medical devices.

Currently, a medical device's algorithm must be “locked” by the manufacturer before the FDA will approve it. A “locked” algorithm returns the same result every time it is given the same input, while an adaptive algorithm, which uses AI, changes its behavior through a defined learning process.
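The distinction can be sketched in a toy example (all class names, thresholds, and update rules here are hypothetical illustrations, not any actual device's software): a locked classifier's decision rule is frozen before approval, while an adaptive one changes its rule as it learns from new cases.

```python
# Toy illustration of "locked" vs. adaptive algorithms.
# Names and thresholds are hypothetical; no real device is implied.

class LockedClassifier:
    """Decision rule fixed at manufacture; same input always gives the same output."""
    def __init__(self, threshold):
        self.threshold = threshold  # frozen before FDA approval

    def predict(self, score):
        return score >= self.threshold


class AdaptiveClassifier:
    """Adjusts its decision rule from feedback via a defined learning process."""
    def __init__(self, threshold, learning_rate=0.1):
        self.threshold = threshold
        self.learning_rate = learning_rate

    def predict(self, score):
        return score >= self.threshold

    def learn(self, score, true_label):
        # Defined update rule: nudge the threshold to reduce the observed error.
        error = (1.0 if true_label else 0.0) - (1.0 if self.predict(score) else 0.0)
        self.threshold -= self.learning_rate * error


locked = LockedClassifier(0.5)
adaptive = AdaptiveClassifier(0.5)

print(locked.predict(0.45))    # False -- and will be False every time
adaptive.learn(0.45, True)     # device observes a missed positive case
print(adaptive.predict(0.45))  # True -- behavior changed after learning
```

Under the current regime, the adaptive device's post-approval behavior change is exactly what would trigger regulatory questions; the proposed framework is meant to define when such changes need new review.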

To be sure, the FDA has previously approved some adaptive learning devices. In 2018, for example, the agency approved AI devices that detect an eye disease and provide stroke alerts. However, the FDA seems poised to extend that opportunity to all manufacturers and formalize the review process.

The agency wants to create a framework that would safely allow the AI in medical devices to make reactive choices without prior approval from the agency. The regulator discussed the topic in the 20-page “Proposed Regulatory Framework for Modifications to Artificial Intelligence/Machine Learning-Based Software as a Medical Device” discussion paper released on April 2.

The paper's proposed framework for devices that use adaptive algorithms is based on the International Medical Device Regulators Forum risk categorization, the FDA's benefit-risk framework, each software's risk management principles and the device maker's total product lifecycle as listed in the Digital Health Software Precertification program. The agency also proposes incorporating practices from its current premarket programs, including the 510(k), de novo and premarket approval pathways.

While some modifications to medical devices, including those that alter the device's usage, would need a new submission and approval from the FDA, other updates caused by a machine's adaptive AI wouldn't require additional review.

Medical device lawyers said the FDA's plan to provide a formal framework for the technology is a welcome sign for manufacturers.

“The most important thing in [the paper] is it's adding some real meat to the bones, which they haven't really done before, with how you would interact with FDA in regards to modifications,” said Hogan Lovells partner Jonathan Kahan, who is co-director of the firm's medical device practice group.

“They basically said, if there's not really a significant amount of information [provided by the device] or that the health care decision isn't going to be based on it or how the health care professional will treat a disease, they probably aren't going to apply as high of a risk or moderation,” he added.

Medical device manufacturers will be expected to have an established quality system to develop, deliver and maintain the device, and to conform to standards and regulations. What's more, while the paper isn't formal guidance or a draft rule, it does offer some insight into the agency's thinking on how to encourage evolving technology while safeguarding patients' well-being.

“Although FDA says this is not the draft guidance and we have all these questions, they've given it significant thought,” said Linda Pissott Reig, who co-chairs Buchanan Ingersoll & Rooney's FDA section. “Now we can assess what their thinking is, what the pitfalls may be.”

When asked about the possible pitfalls of more adaptive AI powering medical devices, Reig pointed to media reports of IBM's AI engine Watson giving unsafe treatment recommendations for cancer patients. She said it was a cautionary tale of leveraging big data while not ensuring accurate outcomes. “I think the pitfalls that exist are, what are the modifications that are being proposed and is functionality truly vetted?” Reig said. “Is there a quality control approach?”

The collection, storage and analysis of a patient's data by an AI-based medical device may also pose a data privacy issue. But making sure medical devices only collect and store de-identified data may help ease privacy concerns, Reig said.
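The de-identification approach Reig describes can be sketched as follows (a minimal illustration; the field names are hypothetical, and real de-identification, such as under the HIPAA Safe Harbor standard, covers far more identifiers and rules than shown here):

```python
# Minimal sketch of stripping direct identifiers from a patient record
# before a device stores it. Field names are hypothetical examples only.

DIRECT_IDENTIFIERS = {"name", "address", "phone", "email", "ssn"}

def de_identify(record):
    """Return a copy of the record with direct identifiers removed."""
    return {k: v for k, v in record.items() if k not in DIRECT_IDENTIFIERS}

raw = {"name": "Jane Doe", "ssn": "000-00-0000", "heart_rate": 72, "spo2": 0.98}
stored = de_identify(raw)
print(stored)  # {'heart_rate': 72, 'spo2': 0.98}
```

The design point is that identifying fields never reach storage at all, rather than being filtered out later, which narrows what a breach or secondary analysis could expose.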

“My sense is that we could enter a scenario where the public health goal could be viewed as more significant in the opportunities that are offered than [the] personal privacy interest,” she said.