
New HHS Rule Issued to Monitor AI Tools Used by EHR Vendors for In-Hospital Patient Care

  • Writer: IBAOE
  • Jan 30, 2024
  • 2 min read

Updated: Feb 1, 2024



Author: Fran Kritz

AAMI


The U.S. Department of Health and Human Services (HHS) recently issued a Final Rule that requires software vendors to disclose how the AI tools they supply are trained, developed, and tested. The goal is to help protect patients against potentially harmful decisions when machine learning or other artificial intelligence (ML/AI) algorithms are used to help determine care.


According to the new rule, beginning in 2025, electronic health record (EHR) vendors that develop or supply ML/AI tools will be required to disclose more technical information to clinical users about the tools' performance and testing. Vendors will also have to explain the steps they have taken to manage potential risks. The new rules are part of a broader package of regulations that also aims to prevent EHR vendors and hospitals from blocking the exchange of digital patient information. The rule was written by the Office of the National Coordinator for Health Information Technology, which regulates the use of EHRs and patient data exchange.


“Given the proliferation of (AI products) used in healthcare and supplied by developers of certified health IT, we believe now is an opportune time to help optimize the use and improve the quality of AI and machine learning-driven decision support tools,” the new rules read.


Some EHR vendors plan to release or are already marketing AI models for hospital room monitoring that can predict whether a patient’s health is likely to worsen or develop significant complications such as sepsis. However, several high-profile cases have arisen in which vendors have been accused of overselling the capabilities of these programs. In other instances, patients have accused health systems of not disclosing the use of AI for informing decisions about their care.


Earning the trust of caregivers and patients alike remains one of the more daunting challenges for developers hoping to bring AI tools into healthcare. “When done right, AI has enormous potential,” noted Jesse Ehrenfeld, president-elect of the American Medical Association, professor of anesthesiology at the Medical College of Wisconsin, and co-chair of AAMI’s Artificial Intelligence Committee.


“Digital medicine has enormous opportunity to improve health outcomes. There is tremendous enthusiasm about disruptive innovation as long as it's clinically validated,” added Ehrenfeld. “We hear concerns that a lack of transparency will interfere with that trust [and interfere] with understanding how these tools were designed and validated.”


According to the HHS, the Final Rule does not apply to AI models developed by hospitals internally.




© 2023 International Biomedical Association of Education | All Rights Reserved | Terms and Conditions
