Most women have yet to form an opinion about breast imaging AI
Most women have yet to form an opinion about the use of artificial intelligence in breast imaging, according to new nationwide survey data released Tuesday.
Among more than 3,500 patients polled, between 43% and 51% expressed neutral feelings about the technology, depending on the question. However, those who were younger or had higher electronic health literacy or educational attainment were “significantly” likelier to see AI as a useful aid in medicine, experts wrote Tuesday in JACR.
“These findings suggest that there is opportunity to educate and engage with our breast imaging patients who may be undecided about the implications of AI in radiology practices,” radiologist Brian N. Dontchos, MD, with the University of Washington, Seattle, and co-authors advised. “Addressing concerns that AI may exacerbate preexisting healthcare disparities and biases will be important.”
Researchers distributed the 36-question, printed survey to patients served by eight U.S. radiology practices, six academic and two nonacademic. Polling closed in October 2024 with 3,532 surveys collected, for a response rate of nearly 70%. Participants’ median age was 55, with the sample skewing white (73%) and educated (77% had completed college).
The survey included four questions on general perceptions of AI in healthcare, and most patients responded neutrally (43%–51%). When presented with the negatively framed statement, “I find using artificial intelligence to perform medical tasks a bad idea,” respondents showed a “balanced preference,” the authors reported: about 28% disagreed with the statement, compared to 25% who agreed.
Those with lower electronic health literacy and less education were “significantly” more likely to indicate it was a bad idea for AI to perform medical tasks. Compared with other survey respondents, non-white patients also were likelier to express concern that artificial intelligence will not work as well for some minority groups. Overall, favorable opinions about the technology typically came from individuals who were younger, more educated and had a greater grasp of electronic health tools (as measured by eHEALS).
“Individuals who use and interact with AI in other arenas may be translating these impressions to healthcare applications,” Dontchos and co-authors advised. “In contrast, practices that wish to adopt AI tools may need to develop educational outreach efforts specifically for patients who are less technologically savvy to foster trust and acceptance. Aside from older age, identifying patients who do not access their electronic medical record or schedule appointments online may be a straightforward means of targeting patients for outreach.”
Along with worries about bias, a smaller share of patients expressed concern about the loss of privacy that might come with imaging AI. These concerns were most common among non-white survey respondents, likely reflecting “racial differences in mistrust of the medical system.”
“To achieve successful implementation, medical providers should understand the rationale of public concerns about AI to focus educational efforts on mitigating mistrust when introducing new far-reaching technology,” the authors wrote, adding that patients who had never spoken to a radiologist were more likely to view healthcare AI as a bad idea. “As integration of AI into radiology practices grows, care delivery will inevitably progress and providing patient education in this evolution will hopefully support compliance and trust (e.g., making it more likely a patient will present for a screening mammogram or for recommended follow-up imaging appointments),” the authors added later.
Read more, including potential study limitations, in the Journal of the American College of Radiology.