Radiologists debate the merits of AI for work list triage in emergency settings
Radiologists are debating the merits of using artificial intelligence to help with work list triage in emergency settings.
Physicians with two academic institutions made their cases in dueling opinion pieces published Wednesday. AI has shown promise in neuroradiology, helping automate the prioritization of critical findings such as stroke, intracranial hemorrhage or spine fracture. Automation could be especially useful in eliminating emergency care delays and catching “subtle findings that might be missed in high pressure environments,” experts wrote in the American Journal of Roentgenology.
However, AI comes with potential downsides, including posing legal risks for radiologists.
“The implementation of AI triage systems raises complex questions about liability and responsibility,” Manoj Tanwar, MD, and Sandeep Bodduluri, PhD, with University of Alabama at Birmingham Heersink School of Medicine, wrote Feb. 12. “If an AI system fails to flag a critical case or incorrectly prioritizes a nonurgent study, then determining responsibility between the healthcare practitioner, the institution, and the AI vendor can become complicated.”
Radiology practices are already providing 24-hour coverage (including stat studies) without the help of AI triage, whether through in-house staff or contracted physicians. These imaging groups can provide fast turnaround times, with long radiologist work lists “becoming less and less common,” the writers contended.
“Payment remains challenging, as it is unclear who will pay for AI tools,” Tanwar and Bodduluri added. “This issue is important given radiologists’ declining reimbursement over the past decade and the lack of improved speed from use of AI, at least in the setting of intracranial hemorrhage detection on CT.”
Radiologists Sowmya Mahalingam, MD, and Melissa Davis, MD, MBA, meanwhile, made the counterargument in favor of AI triage. They emphasized the importance of understanding the strengths and weaknesses of triage tools to help guide each practice’s assessment of their cost-effectiveness and utility. Radiologists have complained of false positives from these systems, which can create “workflow inefficiencies and potential alert fatigue.”
To address this at the Yale School of Medicine, where they work, their AI system for intracranial hemorrhage detection adds follow-up tags to exams to assess for stability of known ICH. Radiologists also can review an AI-generated heat map of the positive finding to decide whether it’s legitimate.
“For most practices, a small number of false-positive AI results would be an acceptable trade-off for the significant efficiency gains, including improved report turnaround times and overall ED throughput,” Mahalingam and Davis wrote.
In the past, technologists helped notify radiologists about urgent findings or visibly deteriorating patients, but that’s less feasible in 2025, given staffing concerns.
“While such interactions still occur, they cannot be relied upon as in the past given growing workloads, shifting technologist resources, and remote radiologist coverage,” the authors concluded. “With ever-increasing examination volumes and increased pressures on report turnaround times in the setting of a radiologist workforce shortage, AI solutions that help radiologists prioritize patients with more acute needs will be invaluable in achieving timely care.”