Enforcement of rule regulating discrimination in AI use will start May 1, ACR warns

The feds are slated to begin enforcing a new rule regulating discrimination in the use of artificial intelligence on May 1, the American College of Radiology recently warned members of the specialty. However, it’s still uncertain whether the Trump administration will follow through. 

Health and Human Services issued the final rule, which relates to Section 1557 of the Affordable Care Act, in May 2024. It includes new regulations mandating that providers and payers make “reasonable efforts” to mitigate discrimination related to AI and other clinical decision support tools. 

The regulation covers most hospitals, clinics, physician practice groups and academic medical centers, experts wrote in Health Affairs last year. All were slated to have until the end of this month to meet the requirements, but it remains unclear whether the White House will move forward with the regulation. 

“While covered entities must comply by May 1, changing policy priorities, the removal of several online compliance resources and a significant department reorganization have made HHS’ future enforcement intentions less clear,” the American College of Radiology said in an April 11 news update.  

Experts note that the rule applies to a broad array of clinical tools, both automated and not, used for screening, risk prediction, diagnosis and treatment planning. HHS had issued a fact sheet and FAQ about the updates a year ago, but both appeared to have been taken down as of Friday. ACR noted that the rule would require documentation of efforts to ensure that AI and other such tools do not lead to patient discrimination based on race, sex, age, disability or other protected characteristics. 

HHS had indicated previously that it would review any violations on a case-by-case basis, ACR noted. The college encouraged impacted entities to consider working with their legal compliance teams to “better understand and comply” with these requirements. In November, experts had predicted that the likelihood of “extensive federal enforcement” of this rule would be challenged by “profound staffing and funding limitations” within the Health and Human Services Office of Civil Rights.

“ACR will continue to monitor the situation at HHS OCR for further developments,” the college noted. 

McDermott Will & Emery also published a blog in January offering further details about the rule. The law firm said several state legislatures also have enacted or proposed legislation to regulate deployment of AI and other clinical decision support tools. States such as Illinois, Colorado and California have taken “vastly different” approaches to regulation. Absent a uniform legislative approach, MW&E urged organizations to ask questions such as: 

  • Does your tool contain safeguards against algorithmic discrimination?
  • Could the data used to train the tool lead to inadvertent algorithmic discrimination?
  • Is there a process in place to evaluate whether algorithmic discrimination is taking place?
  • Is there appropriate documentation in place to show efforts to evaluate each of the foregoing?

Even if the feds decide to rewrite the regulation, McDermott Will & Emery urged provider organizations to pay attention and consider taking steps to ensure AI is not used in a way that could perpetuate discrimination. 

“Although the Trump administration is likely to revise or repeal Section 1557 regulations in their entirety, efforts that covered entities take toward meeting the May 1, 2025, patient care decision support tool compliance date will lay strong groundwork for meeting state and federal government expectations for use of AI as part of patient care,” experts wrote Jan. 16. 

Marty Stempniak

Marty Stempniak has covered healthcare since 2012, with his byline appearing in the American Hospital Association's member magazine, Modern Healthcare and McKnight's. Prior to that, he wrote about village government and local business for his hometown newspaper in Oak Park, Illinois. He won Peter Lisagor and Gold EXCEL awards in 2017 for his coverage of the opioid epidemic. 
