Clinical decision support has little impact on image-ordering behaviors, single-center study finds

Clinical decision support appears to have little impact on image-ordering behaviors, according to a new single-center study published Thursday in the Journal of the American College of Radiology [1].

The Protecting Access to Medicare Act requires outpatient and emergency department providers to deploy clinical decision support (CDS) systems to curb the use of unnecessary imaging. But do they actually work? Researchers with Washington University in St. Louis, Missouri, aimed to answer this question, analyzing 17,355 imaging orders logged at their ED over a six-month period.

Govind S. Mattay, MD, MBA, and colleagues reported little impact after implementation, though the system did not overburden referrers, and they believe AI could help improve the numbers.

“CDS alerts triggered by low [appropriate use criteria] scores caused minimal increase in time spent on imaging order entry, but had a relatively marginal impact on imaging study selection,” Mattay, with WU’s Mallinckrodt Institute of Radiology, and colleagues advised. “[Artificial intelligence predictive text] implementation increased the number of scored studies and could potentially enhance [clinical decision support’s] effects.”

Washington University utilizes CareSelect Imaging CDS, which requires providers to order advanced imaging through the electronic medical record. Clinicians have the option to select from a common list of structured indications for studies or input a free-text alternative. For orders that use a structured indication, CDS automatically scores the request on a scale from 1 (least appropriate) to 9 (most appropriate). It further sorts requests into three color categories: red (scores 1-3, indicating “usually not appropriate”), yellow (scores 4-6, denoting “may be appropriate”), and green (scores 7-9, “usually appropriate”).
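The score-to-color tiers described above amount to a simple mapping. A minimal sketch in Python, assuming only the thresholds reported here (the function name and error handling are illustrative, not CareSelect’s implementation):

```python
def categorize_score(score: int) -> str:
    """Map a 1-9 appropriateness score to its CDS color category."""
    if not 1 <= score <= 9:
        raise ValueError("appropriateness scores range from 1 to 9")
    if score <= 3:
        return "red"      # usually not appropriate
    if score <= 6:
        return "yellow"   # may be appropriate
    return "green"        # usually appropriate
```

Under this scheme, an order scored 2 would land in the red tier and trigger an alert, while a 7 would pass through as green.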

The CDS system triggered alerts for 3% of all imaging orders during the study period (522 of 17,355), suggesting other exams with higher appropriateness scores. Ordering providers selected an alternative study based on such suggestions in just 3% of these scenarios (18/522), or about 0.1% of all imaging orders (18/17,355). Of the 18 changed orders, 15 related to contrast administration (e.g., magnetic resonance imaging of the cervical spine with/without contrast switched to just MRI without). The other three pertained to ordering additional unnecessary studies for trauma cases.

In the 522 instances where CDS alerts were triggered, clinicians spent an additional 25 seconds interacting with the system. The most common reasons for not following CDS’s suggestions included disagreement with the appropriateness score (32%) or “requested by a consultant” (25%).

Washington University later implemented AI predictive text in July 2021. This model takes the clinician’s free-text input, crunching additional patient and provider data to suggest a structured indication. If the correlation between the free-text entry and the proposed structured alternative is strong enough, the system automatically converts the former into the latter. After the launch of this AI tool, the percentage of unscored studies dropped from 81% to 45%.

Washington University experts see broader uses for artificial intelligence in fine-tuning clinical decision support.

“Future AI tools could analyze CDS-derived and EMR data on a population-level to not only reduce inappropriate imaging utilization but also make suggestions on optimal screening imaging studies for specific patient cohorts,” the authors wrote. “CDS could also be enhanced by additional AI tools that analyze data at a wholistic patient level to consider additional factors such as other diagnoses and allergies to make better imaging order suggestions. As these CDS versions emerge, it will be paramount to analyze their effect on clinical care.”

Read more from the study, including potential limitations, at the link below.

Marty Stempniak

Marty Stempniak has covered healthcare since 2012, with his byline appearing in the American Hospital Association's member magazine, Modern Healthcare, and McKnight's. Prior to that, he wrote about village government and local business for his hometown newspaper in Oak Park, Illinois. He won Peter Lisagor and Gold EXCEL awards in 2017 for his coverage of the opioid epidemic.
