Catching Up With the Future: The Radiology of Tomorrow
The future is here—it just hasn’t made it to radiology yet. A restless pioneer spirit continues to drive radiology into the future, even if that future is lagging well behind advances achieved by Internet commerce companies. “This is not novel; this is how IT works in every other vertical setting except medicine,” Paul Chang, MD, FSIIM, explains. “We’re 10 years behind the rest of the world. We are arrogant and ignorant—it’s OK to be one, but it’s bad to be both.”
Chang is talking about service-oriented architecture (SOA), a way of creating customized applications by incorporating software in existing vendor systems and linking them to achieve specific purposes. With SOA, he says, the radiologist of the future will be able to access information from various sites so that it all comes together on the workstation. “The big future now is interoperability,” Chang says. “We’ve got the PACS, the RIS, and the electronic medical record (EMR), but they’re all little islands. I don’t want that. I want the best parts of all of them, in an experience optimized for me.”

Chang uses the example of ordering from Amazon to explain how SOA works. “When I push the button to order something, Amazon is talking to 20 different systems—UPS, Toys“R”Us, and so on,” he says. “If we are forced to order something in the hospital, we have to log in to all the different vendors. The technology that allows Amazon to contact them simultaneously is SOA. We need the same thing for patient care.”

SOA is so important to Chang that he uses it in one of his titles. He is medical director for SOA infrastructure at the University of Chicago Hospitals; he is also professor and vice chair for radiology informatics at the University of Chicago School of Medicine.

Chang says that one thing that SOA will bring to the radiologist’s cockpit of the future is the ability to integrate clinical data seamlessly into the radiologist’s workflow as he or she sits at the workstation. “I’m an oncologic body imager,” he says. “It’s important for me to understand the clinical context before I render an interpretation.” He adds, however, that the clinical information that comes with a requisition for imaging is very limited. To get better information, Chang says, the typical radiologist must log into the hospital’s EMR database and search the clinical history under that patient’s name.
“It takes time, and the application is designed for the primary care physician; it’s not meant for me,” he says. At the University of Chicago Hospitals, where Chang has already partially implemented SOA, the patient’s medical history from the EMR shows up with a click on the PACS. “SOA allows me to mash up or create any arbitrary application for the user,” Chang says. “That’s my PACS. When I look at my PACS, instead of forcing me to go to other applications, it pops it up the way I want it, in an appropriate, idiosyncratic manner. It makes it look like the PACS pops up laboratory reports.”

Broader Efficiency

SOA is only part of the delivery system that radiologists will use in the future, Chang says. “Even today, we’re pretty efficient in the reading room,” he says, “but talking about report turnaround or patient throughput is too limited a view. The clock starts when a physician decides an image is needed, and it ends when he or she gets information from that image that pertains to patient care.” If technologists have to struggle with manually composed imaging protocols at the scanner, that creates inefficiency and adds to the referring physician’s wait for results. “That’s the lack of interoperability,” Chang says.

The University of Chicago Hospitals are already experimenting with attaching radiofrequency identification chips to patients to track them through the care-delivery process, including the radiology suite. As soon as the patient enters the facility, the protocols for the scan can be entered electronically into the scanner, saving time for the technologist and, down the line, the referrer. “We’re ready to deploy that now,” Chang says. “Cutting 10 to 20 minutes off the scan time is huge. Now, the whole complex protocol takes one second. The next theme is to look at the whole cycle.” Chang’s team is also working on automatic prompts that will remind radiologists to call referrers for follow-up of tiny tumors that need to be watched.
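The SOA “mash-up” Chang describes can be sketched as a thin facade: one function that calls several backend systems and assembles a single patient-context view for the workstation. Every service and field name below is a hypothetical stand-in, not a real hospital API.

```python
def fetch_emr_history(patient_id):
    """Stand-in for a call to a hypothetical EMR service endpoint."""
    return {"history": ["oncology follow-up", "prior CT abdomen"]}

def fetch_lab_results(patient_id):
    """Stand-in for a call to a hypothetical laboratory results service."""
    return {"labs": {"AFP": "12 ng/mL"}}

def fetch_prior_reports(patient_id):
    """Stand-in for a call to a hypothetical RIS/report service."""
    return {"reports": ["CT: stable 8 mm hepatic lesion"]}

def patient_context(patient_id):
    """The facade: query each service and merge the answers into one view,
    so the workstation shows clinical context without separate logins."""
    context = {"patient_id": patient_id}
    for service in (fetch_emr_history, fetch_lab_results, fetch_prior_reports):
        context.update(service(patient_id))
    return context

print(patient_context("example-id"))
```

The point of the design is the one Chang makes about Amazon: the caller pushes one button, and the facade, not the user, does the talking to every system behind it.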
Another prompt requires emergency-department attending physicians to acknowledge discrepancies that radiologists discover when overreading residents’ preliminary reports, Chang says. This ensures quality control and better patient care. A further improvement in care will come with increased focus on computerized physician order entry systems and the patterns they generate, Chang says. Does the ordering physician realize that the patient has had multiple CT exams? Automatic prompts will work there, too, to mitigate needless or dangerous radiation exposure, Chang suggests. “We have got to demonstrate value,” he says. “What can IT do on this?”

Thinking Machines and Decision Support

At the Stanford University School of Medicine in California, Daniel Rubin, MD, MS, is researching other ways for the radiologists of the future to demonstrate the added value that Chang says will be demanded. “The main difficulty with radiology workstations now is that they’re not very intelligent,” Rubin says. “They allow radiologists to find a patient’s images, but there’s no intelligence that guides interpretation.”
Systems are under development, Rubin says, that will allow the radiologist of the future to integrate knowledge bases into the interpretative process, all while remaining at the workstation. The key here is what Rubin calls content-based image retrieval: instead of descriptors being used to launch queries, the image itself will be the query.

“Imagine a paradigm where the radiologist is looking at an unusual-appearing mass in the liver and doesn’t know what it is,” Rubin says. The current practice is either to ask a colleague or to go to a book and look up suspected diseases. “If radiologists don’t think of the disease that it might actually be, they might not find the matching lesion,” Rubin adds. “We can use the image as a query for the database of images for which the diagnoses are known, retrieve similar images, and see what diagnoses are associated with them. This can guide radiologists’ decision making as to what the diagnosis might be.”

Rubin says that commercial photographers and editors are already using images as query prompts when setting up photography sessions. The problem for radiology is more complex, he says, because a radiologist wants to query just the part of the image containing the lesion; nonetheless, the problem can be solved, and its solution will greatly enhance decision support.

Rubin, who is an assistant professor of radiology at Stanford, is also head of the school’s Laboratory of Imaging Informatics. The laboratory is researching image-query decision support, Rubin says, adding that preliminary results have been encouraging. “I’m very optimistic that in a five-year time frame, we’ll at least have some sort of basic image-search functionality that will be useful to radiologists,” he says.
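The retrieval idea Rubin describes can be shown in miniature: treat the lesion region itself as the query, compare it against a database of regions with known diagnoses, and return the closest match. This toy sketch uses a normalized intensity histogram as the image signature and an L1 distance for similarity; real content-based retrieval systems use far richer features, and all data below is synthetic.

```python
import numpy as np

def lesion_signature(region, bins=16):
    """Normalized intensity histogram of a lesion region (values in [0, 1])."""
    hist, _ = np.histogram(region, bins=bins, range=(0.0, 1.0))
    return hist / max(hist.sum(), 1)

def most_similar_diagnosis(query_region, reference_db, bins=16):
    """Return the known diagnosis whose example lesion best matches the query."""
    q = lesion_signature(query_region, bins)
    best, best_dist = None, float("inf")
    for diagnosis, region in reference_db.items():
        dist = np.abs(q - lesion_signature(region, bins)).sum()  # L1 distance
        if dist < best_dist:
            best, best_dist = diagnosis, dist
    return best

rng = np.random.default_rng(0)
reference_db = {                     # "images for which the diagnoses are known"
    "simple cyst": rng.uniform(0.0, 0.3, 400),  # fluid-like, low intensities
    "solid mass": rng.uniform(0.6, 1.0, 400),   # soft-tissue-like, higher values
}
query = rng.uniform(0.0, 0.3, 400)   # unknown lesion with cyst-like intensities
print(most_similar_diagnosis(query, reference_db))  # prints: simple cyst
```

This also illustrates the extra difficulty Rubin notes: the query here is already a cropped lesion region, and isolating that region from the full image is itself a hard problem.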
When image-query decision-support tools become available, they will not only improve the individual radiologist’s performance, but will lessen the distinction between expert and nonexpert and, at least in terms of decision making, reduce variation in practice among radiologists, Rubin says. “Radiologists will have more and more benchmarks for performance,” he adds.

Rubin foresees the development of semantic systems and knowledge bases (ontologies) that will allow much more of the interpretative process to be machine assisted. Radiologists at their workstations will have access to all kinds of diagnostic and clinical information, so that “modeling the relationship between the radiologist’s observations and the diseases can extend to clinical data or molecular data,” Rubin says. If the probability of malignancy from what is seen in an image is 10% according to diagnostic modeling data, and if 2% probability has been established as the cutoff for recommending biopsy, then the decision to perform biopsy is clear, and variations in radiology practice can be reduced, Rubin says.

Rubin is also researching better ways to computerize annotations and other marks on images so that radiologists will be able to improve cancer staging, over time, through the automatic generation of tumor measurements and associated graphics. These will appear on their workstations along with patients’ latest images, Rubin says.

The Cockpit of the Future

How quickly can the cockpit of the future (the highly sophisticated, integrated workstation at which tomorrow’s radiologist will sit) be developed? First, Rubin says, the arduous but ongoing work of developing disease models, image-based queries, lexicons, and ontologies will have to be completed. New standards will have to be written that will bring uniformity to the machine processes handling all the tasks. How quickly this happens, Rubin says, will depend on how adamant radiologists are in demanding the new tools.
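Rubin’s biopsy-cutoff example reduces to a single comparison: act when the modeled probability clears an agreed threshold. The 10% probability and 2% cutoff are the article’s numbers; the function itself is an illustrative sketch, not any real decision-support product.

```python
def biopsy_recommended(p_malignancy, cutoff=0.02):
    """Recommend biopsy when the modeled probability of malignancy
    meets or exceeds the agreed cutoff (2% in Rubin's example)."""
    return p_malignancy >= cutoff

print(biopsy_recommended(0.10))  # True: 10% is well above the 2% cutoff
print(biopsy_recommended(0.01))  # False: below the cutoff, so surveillance instead
```

The value, in Rubin’s terms, is that every radiologist applying the same model and the same cutoff reaches the same recommendation, which is how practice variation shrinks.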
“The technology is here or is being developed,” he says. “What’s really going to light the fire under vendors is radiologists’ awareness of these capabilities and their pushing of vendors to incorporate them. It will take radiologists saying, ‘It’s not efficient to have five different systems. I need it all integrated.’”

Rubin and Chang aren’t the only ones who foresee—and are planning for—an integrated radiology of the future based on interoperability. Eliot Siegel, MD, FACR, FSIIM, says that vendors are already scrambling to introduce technology that will meet interoperability demands. “At night, when only one or two radiologists for emergency interpretations are on duty for several sites, they have to tie things together,” Siegel says. “The single biggest issue in radiology is workflow from multiple enterprises when all are involved. Everybody wants one seamless workflow. I don’t think vendors will fight this. They are all coming out with multiple-enterprise workflow engines.”
Siegel, who is professor and vice chair of the department of diagnostic radiology at the University of Maryland School of Medicine in Baltimore, says that multiple-enterprise integration can’t arrive soon enough. “We cover for many hospitals, and in many cases, we have to go from one workstation to another because images are on vendor-specific systems,” he says. “In the future, we will be reading from anywhere.”

In fact, Siegel, in another role, is already using a futuristic reading room. In addition to his position at the University of Maryland, he is chief of radiology and nuclear medicine for the Veterans Affairs (VA) Maryland Healthcare System in Baltimore. In 1993, the VA allowed Siegel to configure the world’s first filmless radiology department. More recently, it has helped him configure a radiology reading room of the future, where he now works.

It’s an office where LED technology allows him to turn the outer wall from clear to opaque glass for privacy. He dictates reports on a voice-recognition system and interprets at a three-monitor workstation. “I sit in an ergonomic chair at an ergonomic desktop,” he says. “I also have warm air blowing on me. There is sound-controlled acoustic damping on the walls—an active sound-masking system that is tailored to the human voice, but that actually works better to mask MRI scanning sounds.”

Siegel says that the VA reading room draws visitors from around the world, but it’s only a start. He foresees touchscreen technology and the use of sound and color prompts to announce events such as stat interpretations. Like Rubin, he also sees a future populated by digitally stored annotations and huge clinical databases for decision support. There will be dashboards that highlight abnormal scans. Summaries of patient-specific clinical information and wizards or templates to standardize reporting will one day be commonplace.
“I see us having much more genomic and biomarker information,” he says, “and the biomarkers will be on the molecular level. It will be highly personalized medicine.” These changes won’t be taking place in radiology alone, Siegel says. Oncologists, cardiologists, and primary care physicians will have highly developed workstations that meet their needs and integrate with other subspecialty workstations as well. “I’m not sure if we’re talking about 5, 10, or 15 years from now,” Siegel says, “but these are things for which there are either prototypes or development projects now.”

George Wiley is a contributing writer for ImagingBiz.com.