Scientists devise automated, ‘big data’ approach to assess radiology residents’ report writing progress

Swiss scientists have devised an automated, “big data” approach to assessing radiology residents’ report writing progress over time, sharing their work Tuesday in Radiology [1].

Learning to compose a clear, succinct, high-quality report is one of the most crucial skills developed during radiology training, experts note. However, residents receive little formal instruction in report writing, and many are left to learn in an unstructured way that lacks systematic evaluation.

Researchers with the Department of Radiology at University Hospital Basel aimed to fill this void, retrospectively analyzing nearly 250,000 radiology reports from 53 residents at their institution over five years. They used the Jaccard similarity coefficient, a statistic that quantifies how closely two documents overlap, calculated as the content the documents share relative to their combined content.

The analysis compared residents’ draft reports (the unsupervised first attempt) and preliminary reports (written after a joint readout with an attending physician) against the faculty-reviewed final version. Similarity between residents’ initial and final reports increased by about 6% from first- to fifth-year residents, and the researchers found a strong correlation between resident experience and higher report similarity.
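The article does not detail the study’s text preprocessing, but as a rough sketch, a word-level Jaccard comparison between a resident’s draft and the finalized report could look like the Python snippet below. The tokenization rule and the sample report text are illustrative assumptions, not taken from the paper.

```python
import re

def jaccard_similarity(text_a: str, text_b: str) -> float:
    """Jaccard similarity between two report texts, using lowercase word sets.

    Word-level tokenization is an illustrative assumption; the study's exact
    preprocessing pipeline is not described in this article.
    """
    tokens_a = set(re.findall(r"\w+", text_a.lower()))
    tokens_b = set(re.findall(r"\w+", text_b.lower()))
    if not tokens_a and not tokens_b:
        return 1.0  # two empty reports are trivially identical
    # Shared words divided by all unique words across both reports
    return len(tokens_a & tokens_b) / len(tokens_a | tokens_b)

# Hypothetical example: compare a resident's draft with the reviewed final report
draft = "No focal consolidation. No pleural effusion or pneumothorax."
final = "No focal consolidation, pleural effusion, or pneumothorax. Heart size is normal."
print(f"Jaccard similarity: {jaccard_similarity(draft, final):.2f}")
```

A coefficient of 1.0 would mean the attending changed nothing, while values closer to 0 indicate heavier editing, which is why rising similarity can serve as a proxy for improving report quality.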

“This study successfully applied data mining to discern patterns in the evolution of resident report writing skills, demonstrating that the Jaccard similarity coefficient can serve as a proxy for improvements in report quality over time,” Jan Vosshenrich, MD, a consulting radiologist with University Hospital Basel, Switzerland, and co-authors wrote Sept. 24. “The observed increase in report similarity and decrease in [standard deviation] point to a pattern in reporting performance as residents progress through their training. Data could potentially be used to establish performance benchmarks and better comprehend the developmental trajectories of residents, although variability among subspecialities and imaging modalities suggests that one size does not fit all.”

The similarity between residents’ draft reports and the final version increased by about 14% over the course of a five-year residency program. This trend persisted across all imaging divisions and modalities. “Most notably,” the authors wrote, preliminary report similarity for fifth-year residents was 14% greater than for first-year trainees. 

In an accompanying editorial [2], radiologist Michael A. Bruno, MD, called the results “impressive.” “Not surprisingly,” he added, the largest improvements were seen in the divisions that used freeform-style reports.

“One could envision the method enabling a program of continuous quality improvement, using iterative plan-do-study-act cycles, a standard methodology in quality improvement, to refine the teaching and learning of report writing in a rigorous, data-driven manner, and this points to a direction for future research using their novel tool,” wrote Bruno, a professor, vice chair for quality and chief of emergency radiology at Penn State University. “In comparison with previously published strategies, this new ‘big data’ approach allows for large-scale automation, which minimizes sampling errors, and it importantly does not rely on any subjective human observer judgements. It therefore presents clear advantages over previously published rating scales as a metric for radiology resident report quality in terms of sample size, objectivity and reducing bias.” 

Read more about the results, including study limitations, at the links below. 

Marty Stempniak

Marty Stempniak has covered healthcare since 2012, with his byline appearing in the American Hospital Association’s member magazine, Modern Healthcare, and McKnight’s. Prior to that, he wrote about village government and local business for his hometown newspaper in Oak Park, Illinois. He won Peter Lisagor and Gold EXCEL awards in 2017 for his coverage of the opioid epidemic.

