Lab uses eye-tracking device, AI to study impact of contextual bias on radiologists interpreting mammograms

Radiologists are “significantly influenced” by contextual bias when interpreting mammograms, according to a new study published in the Journal of Medical Imaging.

“When humans are faced with uncertainty while engaging in predictive or discrimination tasks, which are sequential in nature, there is a tendency for perceived patterns to subconsciously influence or bias human decisions based on previous states within the sequence,” wrote Gina Tourassi, team lead and director of the Health Data Science Institute at the U.S. Department of Energy-sponsored Oak Ridge National Laboratory (ORNL) in Oak Ridge, Tennessee, and colleagues. “In addition to being sequential in nature, medical decisions involve individual cases whose pathologies are independent of one another, thereby creating additional potential for bias, in which the practitioner may erroneously (subconsciously) assume that future probabilities are affected by current outcomes (such as mentally adjusting the prior probability distribution for a given pathology after the occurrence of a streaky sequence).”
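The pattern the authors describe is akin to the gambler's fallacy: treating independent cases as if a streak made the next outcome less likely. As a rough illustration only (this is not the authors' model, and the function name, thresholds and prevalence below are invented for the example), a toy simulation shows how that mental adjustment hurts a reader when cases really are independent:

```python
import random

# Toy simulation (illustrative, not the study's model): cases are
# independent with 50% prevalence, and each case yields a noisy
# "suspicion score" that is higher on average when the case is truly
# abnormal. The biased reader raises their threshold after two abnormal
# calls in a row, as if a streak made another abnormal case less likely.
# In this symmetric setup, any such adjustment lowers expected accuracy.
random.seed(0)

def simulate(biased, n=200_000):
    correct = streak = 0
    for _ in range(n):
        abnormal = random.random() < 0.5
        score = random.gauss(1.0 if abnormal else 0.0, 1.0)
        threshold = 1.5 if (biased and streak >= 2) else 0.5
        call = score > threshold
        streak = streak + 1 if call else 0
        correct += (call == abnormal)
    return correct / n

print(f"unbiased accuracy: {simulate(False):.3f}")  # ~0.69
print(f"biased accuracy:   {simulate(True):.3f}")   # noticeably lower
```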

Tourassi et al. studied 10 radiologists with various levels of experience as they analyzed 100 mammograms from a University of South Florida database. Each radiologist wore a head-mounted eye-tracking device that recorded their “raw gaze data” as they interpreted the images.
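The article does not describe how the raw gaze data were processed, but a common first step in eye-tracking analysis is binning the stream of gaze coordinates into a dwell-time heatmap over the image. The sketch below is a hypothetical illustration of that step; the function, grid size and image dimensions are assumptions, not the study's actual pipeline:

```python
import numpy as np

# Hypothetical illustration: raw gaze data is typically a stream of
# (x, y) screen coordinates sampled at a fixed rate. One common way to
# summarize it is a 2-D "fixation heatmap": a grid counting how long
# the gaze dwelt in each region of the image.
def gaze_to_heatmap(gaze_xy, image_size=(1024, 1024), grid=(64, 64)):
    """Bin raw gaze samples into a grid-shaped dwell-time heatmap."""
    heatmap = np.zeros(grid)
    for x, y in gaze_xy:
        gx = min(int(x / image_size[0] * grid[0]), grid[0] - 1)
        gy = min(int(y / image_size[1] * grid[1]), grid[1] - 1)
        heatmap[gy, gx] += 1  # each sample = one tick of dwell time
    return heatmap / max(heatmap.max(), 1)  # normalize to [0, 1]

# Example: 500 simulated gaze samples clustered near the image center.
rng = np.random.default_rng(0)
samples = rng.normal(512, 80, size=(500, 2)).clip(0, 1023)
print(gaze_to_heatmap(samples).shape)  # (64, 64)
```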

Overall, the authors determined that “breast parenchymal density, visual behavior and diagnostic decisions in previous cases may serve as predictors of current diagnostic decisions indicating contextual bias in radiologists’ review and diagnosis of mammographic images in testing situations,” though they were unable to verify these claims in clinical practice.

“In the event that these observations apply in clinical practice, a deeper understanding of how these biases occur, and additional factors, which improve predictability of these biases, will be invaluable in improving training methodology and reducing the occurrence of errors in diagnostic imaging,” they added.

In a news release from the ORNL, Tourassi emphasized the potential importance of these findings.

“These findings will be critical in the future training of medical professionals to reduce errors in the interpretations of diagnostic imaging and will inform the future of human and computer interactions going forward,” she said.

The release also revealed another layer of the team’s research: the group needed artificial intelligence (AI) to work through all of its data. The ORNL houses Titan, one of the country’s most powerful supercomputers, which helped the team considerably. Several AI methods, including convolutional neural networks (CNNs), deep neural networks and deep belief networks, were used to interpret the data. And according to a second study, published in the Journal of Human Performance in Extreme Environments, CNNs proved to be the most helpful.
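Neither the release nor the articles detail the network architectures involved. For readers unfamiliar with CNNs, the minimal sketch below shows the general shape of such a model applied to gaze-derived heatmaps of the kind sketched above; the class name, layer sizes, input format and output labels are illustrative assumptions, not the team's design:

```python
import torch
import torch.nn as nn

# Illustrative only: a small CNN of the general family the team compared
# against deep belief networks. Input is assumed to be a 64x64
# single-channel fixation heatmap; output is a binary diagnostic
# decision (recall vs. no recall).
class GazeCNN(nn.Module):
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1),  # 64x64 -> 64x64
            nn.ReLU(),
            nn.MaxPool2d(2),                             # -> 32x32
            nn.Conv2d(16, 32, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.MaxPool2d(2),                             # -> 16x16
        )
        self.classifier = nn.Sequential(
            nn.Flatten(),
            nn.Linear(32 * 16 * 16, 64),
            nn.ReLU(),
            nn.Linear(64, 2),  # two classes: recall / no recall
        )

    def forward(self, x):
        return self.classifier(self.features(x))

# Smoke test on a random batch of 8 heatmaps.
model = GazeCNN()
dummy = torch.randn(8, 1, 64, 64)
print(model(dummy).shape)  # torch.Size([8, 2])
```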

Tourassi noted that while this research focused specifically on radiology, the data she and her team collected could help other industries learn more about the influence of contextual bias.

Michael Walter, Managing Editor
