The Future Is Now: Massachusetts General Hospital Embraces Deep Learning

Deep learning, artificial intelligence (AI) and automation are gaining momentum in radiology. While some physicians are slow to embrace this trend, fretting over their own job security, others in the industry are inspired by its possibilities. The long-term vision positions AI at the center of momentous change in radiology while also pushing the practice of medicine, disease management and physician efficiency forward at a rapid pace.

Massachusetts General Hospital (MGH) in Boston has clearly embraced deep learning and the benefits AI can bring. Last year, MGH made history by becoming the first medical facility in the world to install an NVIDIA DGX-1 AI supercomputer, “the world’s first purpose-built system for deep learning and AI accelerated analytics, delivering performance equal to 250 conventional servers.”

While the supercomputer is roughly the size of a piece of vintage stereo equipment, MGH believes it has unlimited potential, proving that big things do indeed come in small packages.

“It’s almost like thinking that someone has just deployed electricity, so you’re back in the days of Thomas Edison and now electricity is available,” says Keith Dreyer, DO, PhD, vice chairman and associate professor of radiology at MGH and Harvard Medical School. “You go, ‘Let’s see, we used to scrub clothes by hand, but now we can use a motor and a crank and make a washing machine. Or we could make an air conditioner. Or a television.’ This really is going to change everything.”

Enabling Radiology and Radiologists

James Brink, MD, head of radiology at MGH, says acquiring the supercomputer was the final step of a long process. Researchers from the hospital’s radiology department first approached Brink about machine learning in 2015, and the MGH Center for Clinical Data Science was eventually established within the department.

“Having radiologists be directly involved in this research is so critical because we know the best use cases that are going to be enabling for radiology and radiologists to improve the efficiency of what we do,” Brink says.

NVIDIA delivered the supercomputer to MGH’s Ether Dome, named as the site where anesthetic was first used in surgery back in 1846, and the Center for Clinical Data Science and its research team immediately began investigating what this new technology could do. The plan includes testing and implementing new ways to improve the detection, diagnosis, treatment and management of disease by training a deep neural network using MGH’s database of approximately 10 billion medical images. To process that large amount of data, researchers will utilize algorithms crafted by both MGH data scientists and NVIDIA engineers.

One of the first ways this technology could make an impact on radiology, Brink and Dreyer agree, is by helping specialists with quantification.

“One thing radiologists typically struggle with is measuring tumor and lymph node sizes and tracking those measurements over time,” Brink says. “Having tools that help with that quantification—which, to date, have been somewhat elusive—would really enable robust tracking over time.”

Dreyer, who serves as the executive director of the Center for Clinical Data Science, adds that there are two categories of quantification that will be affected: the quantification humans already do and the quantification of things humans don’t do currently because it is too difficult or takes up too much time.

“If a patient had lymphoma and I was looking at images before and after treatment and I wanted to say what effect the treatment had, imagine the effort it would take to go through every lymph node and measure the exact size down to the cubic millimeter,” Dreyer says. “One could imagine a system using artificial intelligence that could do something like that for us.”

MGH also sees potential for improvements in workflow.

“Other use cases may be in prioritizing worklists—having machine learning tools that identify cases that might have a pneumothorax or might have an abscess,” Brink says. “We could move those types of cases to the front of a worklist for review and evaluation by a trained radiologist.”

As Dreyer points out, MGH’s radiologists read more than 2,000 cases a day, “so it would be nice to be able to know which cases have the highest priority—not just the ones that came out of the scanner first or were at the top of my reading list, but the ones that had some pathology there.”

Deep learning and AI could also greatly improve the patient experience, reducing the amount of time it takes for patients to receive care. Dreyer also envisions a day when these technologies can help cut down on callbacks.

“You could even imagine a time when I’m protocoling an MR and I see a lesion I wasn’t expecting to see,” he says. “Rather than having the patient come back, I could have AI see that image and then modify the pulse sequences to improve the tissue characterization for the radiologist.”

Man vs. Machine?

Advances in AI are worrying some physicians who think their jobs may be in jeopardy. “If that computer can be trained to do my job,” they might say, “will I be out of work?”

Dreyer says specialists have nothing to worry about. Instead of replacing radiologists, AI can improve their day-to-day lives and help them spend more time seeing patients and making diagnoses.

“The technology will really aid what it is radiologists deliver,” Dreyer says. “It will improve the areas of our job that are difficult or tedious.”

Brink agrees with Dreyer, adding that researchers are going to explore the potential of AI with or without the help of radiologists, so it makes sense to be a part of that conversation and be heard.

“Some might say, ‘Why are you pursuing these things if they may alter the landscape?’” he says. “The answer is, if we’re not the ones defining use cases and ensuring that these technologies can improve and enhance the work we do, those around us are going to do that for us and could potentially ask the wrong questions, damage the work we do or detract from the work we do.”

Dreyer has also encountered a certain amount of skepticism in the industry. Some radiologists do not necessarily worry about AI taking their jobs, but they do doubt that the technology could make a significant impact. He says it is easy to find proof that AI can truly affect radiology—just take out your smartphone and search your photos for keywords such as “tree” or “mountain.” The phone can automatically pull up any photos that match that description, even though the images were never labeled. It has been programmed to “see” those things and identify them on its own, with no assistance from the user.

“If computer systems have advanced so rapidly in the last few years that they can identify things in images they were never capable of seeing before, you have to be able to extrapolate that and think they could find things in medical images too,” he says.

Michael Walter, Managing Editor

Michael has more than 18 years of experience as a professional writer and editor. He has written at length about cardiology, radiology, artificial intelligence and other key healthcare topics.
