Only 4% of women comfortable with AI serving as sole reader of mammograms
Only about 4% of women are comfortable with artificial intelligence serving as the sole reader of their mammograms, according to new research published Friday.
Hospitals and radiology practices are rapidly adopting new technologies to aid in breast cancer detection. However, the patient’s perspective often is overlooked in these discussions, experts write in Radiology: Imaging Cancer.
Researchers with the University of Texas Southwestern Medical Center in Dallas recently aimed to explore consumer sentiments around such software, surveying 518 women. While most opposed the idea of solo interpretation, about 71% said they’d be OK with AI serving as a second reader.
"Patient perspectives are crucial because successful AI implementation in medical imaging depends on trust and acceptance from those we aim to serve," study author Basak E. Dogan, MD, director of breast imaging research at UT Southwestern, said in an April 18 statement from the Radiological Society of North America (RSNA), which publishes the journal. "If patients are hesitant or skeptical about AI's role in their care, this could impact screening adherence and, consequently, overall healthcare outcomes."
Dogan and colleagues administered their 29-question survey between February and August of 2023, targeting all women undergoing screening mammography at their institution. Most respondents were between the ages of 40 and 69 (about 73%), college graduates (67%) and non-Hispanic white (51%). Only 23 of the 518 patients surveyed said they were comfortable with AI as a solo interpreter, while 368 preferred the technology to be used as a second reader.
If they experienced an AI-reported abnormal screening, 89% of women said they’d want a radiologist review before scheduling a follow-up appointment. That’s compared to about 51% who said they’d want a radiologist-initiated recall reviewed by artificial intelligence. Higher educational attainment and knowledge about AI both were associated with greater acceptance of the technology. Race also played a role: Latino and Black patients expressed greater concern about AI bias than white participants.
"These results suggest that demographic factors play a complex role in shaping patient trust and perceptions of AI in breast imaging,” Dogan said in the announcement.
Medical history also appeared to impact individuals’ trust in AI, “emphasizing the need for personalized AI integration strategies.” For instance, patients who had a close relative diagnosed with breast cancer were more likely to ask for additional reviews. But they showed a high degree of trust in both AI and radiologist reviews when a mammogram came back normal.
“Our study shows that trust in AI is highly individualized, influenced by factors such as prior medical experiences, education and racial background,” Dogan added. “Incorporating patient perspectives into AI implementation strategies ensures that these technologies improve and not hinder patient care, fostering trust and adherence to imaging reports and recommendations.”