Q&A: Keith Dreyer on radiology’s evolving relationship with AI

Few radiologists understand the relationship between radiology and artificial intelligence (AI) quite like Keith Dreyer, DO, PhD, vice chairman and associate professor of radiology at Massachusetts General Hospital in Boston. He also serves as the chief science officer of the American College of Radiology (ACR) Data Science Institute (DSI) and has spoken at numerous industry conferences about how AI, machine learning and deep learning technologies will change patient care forever.

Dreyer spoke with Radiology Business about radiology’s early response to AI, why machines will never truly replace radiologists and much more. The full conversation can be read below:

Radiology Business: We’re starting to see more and more AI-related solutions gain FDA approval, including some specifically designed to analyze imaging results. Does this progress mean radiologists are a lot closer to experiencing the benefits of AI firsthand?

Keith Dreyer, DO, PhD: Movement in AI is still fairly slow. You still need to have access to data, have the ability to curate data, find a specific problem to solve, obtain high-performance computing and validate your data before you go through regulatory approval. And then once you gain that approval, you still have to market your solution and connect it into clinical workflow. We’re seeing a lot of things gain FDA approval, but that’s still far from full implementation.

There are also still some broad challenges in the market for vendors working in AI. Things like defining the exact use cases, determining what people will pay for these solutions and working out the full business model.

What are your thoughts on how radiology has responded to AI so far?

The response has been broad, but with little harmonization, at least initially. Each supplier looks at this through their own lens—they say, “I have a PACS, so I’m going to make a PACS solution,” or, “I work in visualization, so I’m going to make a visualization tool.” And that’s fine, because AI works by improving a particular tool or function. But the challenge in medical imaging is handling all of these different perspectives and making it so they can come together. That’s one of the reasons the ACR started the DSI, to help bring these perspectives together through common standards and definitions of AI use cases.

One example would be if someone created a pulmonary nodule detector and then someone else created another pulmonary nodule detector. There’s no reason why those two shouldn’t output the same values, the same numbers, for the same examination. Otherwise, if a patient moves from one location to another, they might get different numbers. The ACR is looking at ways to define these use cases much more consistently so that AI applications are built in a consistent way. We would still allow room to evolve and innovate, but the inputs and outputs need to be consistent for manufacturers of other solutions to be able to integrate them.
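To make the point concrete, here is a minimal sketch of what a shared output structure for a pulmonary nodule detector might look like. The field names and units are hypothetical illustrations, not an ACR DSI specification; the idea is simply that two different vendors' detectors emitting the same structure would produce directly comparable numbers that downstream systems could integrate without custom work.

```python
from dataclasses import dataclass, asdict
import json


@dataclass
class NoduleFinding:
    """Hypothetical standardized output for a pulmonary nodule detector.

    Field names and units are illustrative only, not an ACR DSI spec.
    """
    study_uid: str       # imaging study the finding belongs to
    location: str        # e.g., "right upper lobe"
    diameter_mm: float   # longest axial diameter in millimeters
    volume_mm3: float    # estimated volume in cubic millimeters
    confidence: float    # model confidence, 0.0 to 1.0


def to_report_payload(finding: NoduleFinding) -> str:
    """Serialize a finding so any PACS, viewer or EHR integration
    receives the same fields regardless of which vendor produced it."""
    return json.dumps(asdict(finding))


# Two detectors that both emit this structure can be swapped or
# compared for the same examination.
example = NoduleFinding(
    study_uid="1.2.840.example.1234",
    location="right upper lobe",
    diameter_mm=6.4,
    volume_mm3=112.0,
    confidence=0.91,
)
print(to_report_payload(example))
```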

What’s the most common misconception out there about AI in radiology?

For those who are uneducated in this space, the knee-jerk reaction is that it’s going to come down to radiologists vs. AI. People frequently want to know when the day will come that radiologists are replaced by AI, and that’s just the wrong question. It’s like if, when calculators first came out, someone had asked when they would replace accountants. It just does not work that way.

I don’t see a day when radiologists are out of practice and it’s all replaced by computers. But radiologists with AI will be better than radiologists without AI.

What are some potential uses for AI in radiology that aren’t getting the attention they deserve? Do you think AI will be able to do anything that might surprise radiologists?

The low-hanging fruit is the work that radiologists do at the time of interpretation—things such as finding lesions, especially in a screening capacity. But AI could also improve workflow or improve the quality of the images themselves by adjusting the equipment’s settings and, if necessary, reacquiring an image with different parameters. AI could even alert technologists when an unexpected finding is detected so that its characterization can be enhanced.

Fast forward five or 10 years. How will the day-to-day lives of radiologists change as a result of AI and machine learning?

In an ideal setting, some of the work radiologists can’t perform alone—detailed characterization—or the work that is just too tedious—detailed quantification—could be done by AI. Those might be tasks that radiologists do today, but not optimally, or they could be tasks that just aren’t done by humans alone. And some of that information could go straight to the electronic health record (EHR), just as blood work values go straight to the EHR right now.

Radiologists would also have additional information to look at—they would have the image data, but they’d also see the augmented additional data that helps them make a diagnosis. Saving time but also enhancing the diagnosis will be key. It would ideally end up giving radiologists more time to work with physicians and speak to patients.

Michael Walter, Managing Editor

Michael has more than 18 years of experience as a professional writer and editor. He has written at length about cardiology, radiology, artificial intelligence and other key healthcare topics.
