Nearly two-thirds of consumers surveyed say they’d trust a diagnosis from AI over a human doctor

Nearly two-thirds of consumers say they’d trust a diagnosis from artificial intelligence over a radiologist or other human doctor, according to new survey data.

However, the older the individual, the less likely he or she is to support the use of AI in healthcare. Across all four generations polled—baby boomers, Generation X, millennials, and Gen Z—potential patients are most comfortable when healthcare AI is used for medical imaging analysis (60%).

The findings are from a survey of 1,027 individuals conducted by Palo Alto, California-based Innerbody Research.

“When we asked about the different ways AI could be used in healthcare, medical imaging analysis was the most accepted application across all generations,” the consumer research firm noted. “In fact, three out of five people reported being comfortable with this aspect of AI involvement in the healthcare system. The survey participants' observations hold merit, as numerous studies have been published demonstrating the ability of deep learning technology to identify cancer in radiology images.”

Besides imaging, consumers said they were most comfortable with AI use in predictive analytics (47%), electronic health record management (46%), health monitoring (45%), virtual nursing assistance (44%) and drug discovery (41%). Only about 3% of those surveyed said they were uncomfortable with AI deployment in any aspect of medicine.

When broken down by generation, 82% of Generation Z respondents said they’d likely trust an AI diagnosis over a radiologist’s or other doctor’s decision, versus 66% of millennials, 62% of Gen X and 57% of baby boomers. Nearly 7% of the latter age group said they were “not comfortable with any AI in medicine at all,” Innerbody reported.

“The most remarkable takeaway from our survey was that 64% of respondents said they would trust a diagnosis made by AI over a human doctor,” the analysis noted. “This result indicates a significant shift in public perception and trust toward technology,” the authors added later.

Meanwhile, when asked for their top concerns about the use of AI in healthcare, “accuracy of diagnoses” was the No. 1 answer at nearly 54%. Other popular responses included data privacy and security (50%), technical limitations (43%), job loss for healthcare professionals (39%) and “dependence on technology” (38%).

Innerbody also explored the deployment of nanotechnology (microscopic particles used for the prevention and treatment of disease) and robotics in medicine. Again, imaging was the top answer, with 55% of respondents comfortable with the use of nanotech in the specialty, followed by diagnosis (52%) and drug delivery (45%). About 86% of respondents said they would be comfortable with robots performing an X-ray, 82% for a CT scan and 77% for an MRI. At the other end of the spectrum, only 45% were OK with robotics in hip replacement surgery, followed by 46% in heart bypass surgery and 47% in cesarean delivery.

“For medical procedures performed or assisted by robots, all generations found X-rays and CT scans to be in their top three in terms of comfort. It appears that medical imaging feels the least risky to people when it comes to robotic assistance,” the authors noted.

Marty Stempniak

Marty Stempniak has covered healthcare since 2012, with his byline appearing in the American Hospital Association's member magazine, Modern Healthcare, and McKnight's. Prior to that, he wrote about village government and local business for his hometown newspaper in Oak Park, Illinois. He won Peter Lisagor and Gold EXCEL awards in 2017 for his coverage of the opioid epidemic.
