AI REG REACTION — HHS has new rules set to take effect in 2025 mandating that artificial intelligence developers reveal more about how their algorithms work, Ben reports. There’s support in the industry, but Pulse also heard skepticism about how effective the rules might be and questions about liability and scope.

The backstory: The Office of the National Coordinator for Health IT unveiled sweeping rules last week for AI used in most hospitals and doctors’ offices. In short, the ONC regulations will require software developers to provide more data to customers to help providers determine whether AI products are “fair, appropriate, valid, effective and safe.”

The reaction: Major groups, including the Coalition for Health AI — with members including Google, Microsoft and Duke Health — praised the rules. “Putting some standards in place is really important,” Michael Pencina of Duke AI Health, the coalition’s co-founder, told Pulse. “They found the right balance, for the most part, about putting things forward but not being overly prescriptive.”

Agency interaction: Still, there are questions about how ONC and the FDA, which also regulates AI-enabled medical devices, will work together. Cybil Roehrenbeck, executive director of the AI Healthcare Coalition, told Pulse she wished ONC would treat products already regulated by the FDA differently. The FDA’s scrutiny should count for something, Roehrenbeck said. ONC said in its rule that it worked with the FDA to align regulations to reduce the compliance burden for AI developers covered by both agencies’ rules.

Liability: Roehrenbeck said the ONC rules’ reliance on individual clinicians to make calls about AI’s trustworthiness also raises liability concerns. ONC said those concerns are outside the scope of its rule. “If a medical device fails, we know how to move through that process,” Andrew Tomlinson, director of regulatory affairs at the American Health Information Management Association, told Pulse. “We need to have that same process for AI.”

Scope: And Roehrenbeck said she’s gotten several questions about which algorithms the rules apply to and is hopeful for more clarity. The agency said the rule has a broad scope, covering models not directly involved in clinical decision-making that can still affect care delivery, such as those aiding supply chains. An ONC spokesperson said the agency appreciates the “robust public feedback” and welcomes more of it.

WELCOME TO OUR LAST EDITION OF PULSE FOR 2023. Thanks for your readership, feedback and tips all year! We’ll be back in 2024. Please keep sending your tips, scoops and feedback to ccirruzzo@politico.com and bleonard@politico.com and follow along @ChelseaCirruzzo and @_BenLeonard_.

TODAY ON OUR PULSE CHECK PODCAST, host Chelsea Cirruzzo talks with POLITICO health care reporter Daniel Payne about the ways artificial intelligence is already used across the medical landscape and how regulators are responding.