SIIM 2019 Q&A: Dr. Kahn on Radiological AI and Its Practical Pursuits

Charles Kahn Jr., MD, MS 

The terms “AI” and “machine learning” appear early and often on the agenda for SIIM19, which rolls into Denver the last week of June. The heavy dose of AI-related tech talk is no surprise, given the strong and steady interest imaging informaticists have shown in these technologies. The talks will culminate in the closing general session, when Charles Kahn Jr., MD, MS, of Penn Medicine presents this year’s Dwyer Lecture. His topic: “Making Imaging AI Safe, Effective and Humane.”

What many SIIM19 attendees may not know is that Kahn, whose roles at the University of Pennsylvania include professor and vice chair of radiology, began his scholarly pursuits in theoretical mathematics. He also took a sabbatical from medicine around the turn of the millennium to complete his master’s degree in computer science. This mix surely figured in the thinking of RSNA’s publications division when the organization offered him the first editorship of its new journal Radiology: Artificial Intelligence—and in his own thinking when he accepted. 

“As much as I love the theoretical in mathematics and computer science, my goal in medicine has always been practical,” Kahn told RBJ in an interview during the run-up to SIIM19. “I want to take these things we can do using computers and find a way to best apply them to solve real problems in healthcare—to improve patient outcomes, improve access to care, reduce costs—whatever we can do to make healthcare better for everyone.”

Here’s more from the conversation. 

RBJ: Radiology is a particularly tech-heavy medical specialty, and it’s been using CAD in mammography, for example, for a long time. Is it a given that radiology ought to be out in front, helping to lead all of medicine into the age of AI? 

We do have lots of data. Everybody is out there talking about big data. One of the challenges with healthcare is, yes, there are lots of lab values and diagnosis codes, but much of the meaningful information in electronic medical records is tied up in words—progress notes, surgical notes and radiology reports. Laboratory medicine generates numbers, whereas we generate images. And those images provide a large set of data that can be used to train AI models. I don’t know if I’d say that radiology is necessarily in the lead on AI, but among the areas where AI techniques can be applied, including machine learning, deep neural networks and so on, we are as far along as anybody. 

As far as leading the way in medicine, to my knowledge, Radiology: Artificial Intelligence makes us the first specialty with its own journal dedicated to AI. Many of the technologies being applied to radiological images are also applicable to pathology, ophthalmology, dermatology and other imaging fields. We can learn from the approaches used in those areas and see how they have proved their systems’ effectiveness. Radiology has an opportunity to assert leadership, but that leadership will come from doing good science, from showing how these techniques can be developed and evaluated. I don’t think we can just walk in and tell people to follow our lead. We have to demonstrate our leadership.

And there’s already momentum in that direction.

I think so. There’s tremendous interest in applying these technologies to radiology images. And we have the advantage as well that our images are all digital. In pathology they’re just beginning to move toward digital images, capturing whole-slide images and so forth. In radiology we have the advantage that we’ve been capturing information digitally for more than a decade now. We have large collections of images that we can use to train algorithms for things that people want AI to do.

Your topic for the SIIM Dwyer lecture is making imaging AI safe, effective and humane. How might incautious use of the technology threaten those attributes in real-world settings? 

In all of these things, as with any medical technology, we want to make sure it works. It’s never enough for someone simply to claim that something works. We need good ways to evaluate these technologies and prove that they work. We need to show they work in diverse settings, with diverse populations and, in particular, with the population of patients you see in your own community.

The notion that “I built a system where I work, it seemed to work fine for me, and so then I sold it to you” is faulty. There’s no guarantee that my system is going to work appropriately in your setting. We have to be cautious. There’s been a lot of enthusiasm and excitement about this. We appreciate that, but now we have to back it up with facts. 

One of the particular challenges with AI systems is that they learn things we didn’t even know they were learning. And the only way to discover that is through very careful testing and probing of the AI systems to make sure that, when we think they’re making a diagnosis, they are in fact making that diagnosis. 

We don’t work for AI—AI works for us. This isn’t The Matrix. We humans haven’t been inserted into an environment where we’ve been enslaved by some form of technology. We choose how and when to apply these technologies. This ethos applies to every piece of technology humans have ever invented. Technology can be used to the benefit of humankind or to its destruction. It’s incumbent on all of us who are using AI technology to identify how we will choose to use it. 

What must imaging AI developers, vendors and end users do to guard against the risks of misuse?

Certainly there are efforts underway. The FDA has put in place some standards these systems have to meet. I think there’s going to be a higher bar these systems will have to be tested against, and each of us in our own organization has to look out for the risks. If someone walks into our organization and says, “I’ve built a system that finds lung nodules,” our response has to be: “Great. We’ll try it over the next month and see if it actually works with our population.” Is it really detecting clinically important lung nodules? Is it detecting small, calcified nodules that are benign? Or is it truly detecting nodules that I should be worried about and following in order to reduce the chance of the patient developing lung cancer?

There are some things we know we can do in advance. Ensuring that there’s appropriate diversity in the patient populations we study, and that the mix of cases mimics what one would see in the population at large, is challenging but necessary. Nodules are common while cancers are relatively rare, so you have to include one or two orders of magnitude more patients in your training set than you might otherwise. People have to be thoughtful about how they build and test these AI systems.
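To make those two points concrete, here is a minimal sketch of what local testing and the order-of-magnitude arithmetic might look like. It is purely illustrative: the Case structure, the counts in the toy sample and the prevalence figures are assumptions for this example, not figures from Kahn or any specific vendor.

```python
# Illustrative sketch only. Part 1 scores a hypothetical vendor nodule detector
# against locally labeled cases; part 2 shows why a rare finding inflates the
# number of scans a training set needs. All counts and prevalences are invented.
from dataclasses import dataclass


@dataclass
class Case:
    tool_flagged: bool            # did the vendor tool flag a nodule?
    clinically_important: bool    # did local radiologists judge it actionable?


def local_validation(cases):
    """Summarize the tool's behavior on a locally reviewed sample."""
    tp = sum(c.tool_flagged and c.clinically_important for c in cases)
    fp = sum(c.tool_flagged and not c.clinically_important for c in cases)
    fn = sum(not c.tool_flagged and c.clinically_important for c in cases)
    sensitivity = tp / (tp + fn) if (tp + fn) else float("nan")
    ppv = tp / (tp + fp) if (tp + fp) else float("nan")  # how often a flag matters
    return {"sensitivity": sensitivity, "ppv": ppv}


def scans_needed(target_positives, prevalence):
    """Roughly how many consecutive scans are needed to observe the target number of positives."""
    return round(target_positives / prevalence)


if __name__ == "__main__":
    # One month of locally reviewed exams (toy counts).
    sample = ([Case(True, True)] * 18 + [Case(True, False)] * 40 +
              [Case(False, True)] * 2 + [Case(False, False)] * 240)
    print(local_validation(sample))

    # Suppose any nodule appears in ~10% of scans, a clinically important cancer in ~1%.
    for prevalence in (0.10, 0.01):
        print(f"prevalence {prevalence:.0%}: ~{scans_needed(500, prevalence):,} "
              f"scans to collect ~500 positive cases")
```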

And of course there are some excellent resources. The ACR has its Data Science Institute with use cases and all sorts of help for AI stakeholders in radiology. 

Yes, and that’s important, because a lot of very bright and adept scientists are working to apply AI in the healthcare arena, but they come to it from a computer science background. They don’t necessarily know, from a clinical standpoint, where to find the biggest bang for the buck. The role of physicians is to guide the computer scientists toward the most clinically important problems. Getting people to focus on the things that will really make a difference is one of the key ways we as radiologists can contribute, so that computer scientists can direct their talents toward what will have the greatest health benefits.

Can you give an example of how you and your colleagues at Penn Medicine and the Perelman School of Medicine are using AI today?

One of the things we’re doing is creating a tool that identifies and quantifies the burden of disease in patients with multiple sclerosis. These are patients who have a chronic disease and end up getting scanned with MRI repeatedly. They’re on various medications, and the question is whether or not the medication is working. Do we need to change the medication, change the dose? A tool developed by one of our colleagues at Penn Medicine, Michel Bilello, MD, PhD, performs comparative measurements of the lesions and plaques that accumulate in the brain and spinal cord in MS. The tool measures lesion volumes on today’s scan and compares them with those from prior scans.

We’ve been able to use that tool not only to do the measurements but also to reduce the need to administer IV contrast. If we can show on the non-contrast MR images that there’s been no change, we don’t need to use contrast. The study my colleagues did on this found that most of these patients show no change on the standard exam with contrast. (See Mattay et al., “Do All Patients with Multiple Sclerosis Benefit from the Use of Contrast on Serial Follow-Up MR Imaging? A Retrospective Analysis,” American Journal of Neuroradiology, Nov. 2018.)

So instead of doing a long exam before and after getting IV contrast, the patient gets a much shorter MRI exam without contrast. If the MS detection tool finds no significant change in those plaques on the non-contrast MRI, then the patient’s done. Patients are scanned faster; they don’t have to get a stick in the arm. It can save the healthcare system money, too. Everybody wins. 
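For readers who want a sense of how the comparison step could work, here is a minimal sketch of comparing lesion burden between two segmented scans. It is not the Penn Medicine tool itself; the segmentation masks, the 1 mm³ voxel size and the 10 percent change threshold are assumptions made for illustration.

```python
# Illustrative sketch only, not the Penn Medicine MS tool. Given binary lesion
# segmentation masks from a prior and a current non-contrast MRI, compute total
# lesion volume and flag whether the interval change exceeds a chosen threshold.
import numpy as np


def lesion_volume_ml(mask, voxel_volume_mm3):
    """Total lesion volume in milliliters from a binary segmentation mask."""
    return float(mask.sum()) * voxel_volume_mm3 / 1000.0


def significant_change(prior_ml, current_ml, threshold=0.10):
    """Flag the study if lesion burden changed by more than the threshold (10% here)."""
    if prior_ml == 0:
        return current_ml > 0
    return abs(current_ml - prior_ml) / prior_ml > threshold


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    prior_mask = (rng.random((64, 64, 40)) > 0.999).astype(np.uint8)  # toy mask
    current_mask = prior_mask.copy()                                  # unchanged burden

    prior_ml = lesion_volume_ml(prior_mask, voxel_volume_mm3=1.0)
    current_ml = lesion_volume_ml(current_mask, voxel_volume_mm3=1.0)

    if significant_change(prior_ml, current_ml):
        print("Interval change detected; contrast-enhanced sequences may be warranted.")
    else:
        print("No significant interval change; the non-contrast exam may suffice.")
```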

With Radiology: Artificial Intelligence up and running—it’s already drawn more than 1,300 followers on Twitter, just to name one quick hint of impact to come—how have you fine-tuned your vision for the journal from what you had in mind when you accepted the editorship?

Much of a journal’s quality depends on the contributions of its authors. I officially started in the role last July, having been named to it the previous spring. The mission I was given with the journal was to provide a venue for the highest possible quality of science, to spotlight work that is current and demonstrates results convincingly. Where we particularly want to stand out is in not only demanding technical excellence from our authors but also in showing the impact of AI on health outcomes wherever we can.

It’s going to take some time for that [latter] piece to come. We’re still in the pretty early days of developing AI systems and showing them in practice. We’re just getting started going in that direction, so there’s a lot of work yet to be done. But our mission is to provide the same high-quality science as RSNA’s flagship journal, Radiology. And I’ve been delighted with the quality of the work we’ve published in Radiology: AI so far. Going forward, we seek to publish work that’s in the sweet spot of technical soundness and clinical importance. 

What are you looking forward to seeing, hearing or otherwise learning about at SIIM 2019?

The SIIM annual meeting is a terrific place to catch up on the latest in the application of information technology in medical imaging. I’m excited to see the applications that people are driving forward that relate to machine learning and other technologies in imaging, across the board. Whether it’s related to the image itself, the text of the radiology report or ways to organize information in a radiology department, there’s always a lot of novel work being presented. 

Dave Pearson

Dave P. has worked in journalism, marketing and public relations for more than 30 years, frequently concentrating on hospitals, healthcare technology and Catholic communications. He has also specialized in fundraising communications, ghostwriting for CEOs of local, national and global charities, nonprofits and foundations.
