ACR offers resources to achieve radiology AI best practices

 

The American College of Radiology (ACR) has been at the forefront of advocacy for radiology artificial intelligence (AI). The college is helping guide radiologists through the process of vetting algorithms and offers tools to help them adopt the new technology.

The U.S. Food and Drug Administration as of July had cleared 950 clinical AI algorithms. Of these, radiology by far had the largest share, with 723 indicated specifically for medical imaging. However, many of the algorithms cleared for other specialties, such as cardiology, neurology and orthopedics, also involve post-processing of medical images. Understanding what to look for and how to evaluate AI has become a new frontier for most radiologists.

"The ACR had the foresight back in 2017 to actually found a Data Science Institute. I think it was really forward thinking, and that led us on the path to really establish that leadership position in AI," explained ACR's CEO Dana H. Smetherman, MD, MBA.

The institute offers a complete, up-to-date, searchable directory of commercially available imaging AI products in the U.S. The goal of the database is to empower medical professionals to access more-transparent AI product information and make better AI purchasing decisions. This includes lists of AI specific to abdominal, breast, cardiac, dental, chest, musculoskeletal, neuroradiology and pediatric imaging. It also lists AI by product, vendor and AI platform, and offers an overview of the current AI landscape. ACR says thousands of radiologists per month are accessing the site in search of suitable AI solutions.

The DSI's Transparent-AI is a free program composed of data elements voluntarily provided by manufacturers to drive more transparency about imaging AI algorithms, including how they were developed and validated. The program is designed to help end users have better-informed discussions and selection criteria when deciding which algorithm might be most appropriate for their local target population. Manufacturers participating in the program will have the Transparent-AI badge displayed on their product tiles and profiles.

The transparency content is posted on the DSI website and includes model identification, characteristics, indications for use, performance, training details and limitations. Examples of data elements include intended user, age range, and the scanner manufacturers and models used in standalone performance testing. The instructions for use, an FDA-required document that already contains much of the discrete data referenced above, are also included for each product.

"Although very powerful, these tools are not 100% accurate and they can drift. I think we need to have a way for the average radiology practice to be able to know if they get an AI product, that it's really going to work the way that they think it's going to work. So the ACR is going to continue down this pathway to try to make sure that radiologists and the patients can have the reassurance that when an AI product is used, that it's going to function correctly and help in the way that it's supposed to help," Smetherman said.

ARCH-AI enables radiologists to compare AI products

The institute's most recent announcement is the ACR Recognized Center for Healthcare-AI (ARCH-AI) program to review and vet AI products. It is the first national AI quality assurance program for radiology facilities designed to recognize adherence to best practices for the use of AI in imaging interpretation. The program outlines expert consensus-based building blocks of infrastructure, processes and governance for AI implementation in real-world practice.

By working toward, and attesting to, compliance with the program's tenets, radiology practices participating in ARCH-AI can implement AI products safely and effectively and help radiologists provide better patient care, the ACR said.

Radiology practices that complete the ARCH-AI process will receive an ACR Recognition badge to display in their waiting rooms and lobbies to demonstrate to their communities, patients, payers and referring physicians that they are committed to integrating AI in a safe, responsible manner that allows them to provide the best possible healthcare.

The criteria to be considered an ARCH-AI site include:
  • Establishing an interdisciplinary AI governance group.
  • Maintaining an inventory of AI algorithms with detailed documentation.
  • Ensuring adherence to security and compliance measures.
  • Engaging in diligent review and selection of AI algorithms.
  • Documenting use cases and training procedures.
  • Monitoring algorithm performance, including safety and effectiveness.
  • Participating in the Assess-AI national AI registry for performance benchmarking.



Dave Fornell has covered healthcare for more than 17 years, with a focus in cardiology and radiology. Fornell is a five-time winner of the Jesse H. Neal Award, among the most prestigious editorial honors in the field of specialized journalism. The wins included best technical content, best use of social media and best COVID-19 coverage. Fornell was also a three-time Neal finalist for best range of work by a single author. He produces more than 100 editorial videos each year, most of them interviews with key opinion leaders in medicine. He also writes technical articles, covers key trends, conducts video hospital site visits, and is very involved with social media. E-mail: dfornell@innovatehealthcare.com
