Storage Dilemmas in the MDCT World
Be aware of clinical, customer-service, and medicolegal issues in devising an image-storage solution for MDCT studies—and know that the interpretation tools are in transition
In the world of PACS, multidetector CT (MDCT) is widely regarded as disruptive technology. Explosive growth in the number of slices per study has created challenges in virtually every aspect of PACS, including display, image distribution, and storage. At the same time, MDCT has been hailed as a major breakthrough by those whose goal is to improve the ability of CT scanning to diagnose disease.
In order to address the challenges created by CT scanners that can produce thousands of images during a single study, the scan-thin, read-thick concept has emerged. The idea behind this concept is that the clinical information in an MDCT scan can be extracted by rendering the thin axial dataset into thicker, multiplanar series, occasionally supplemented by advanced visualization such as 3D modeling and specialized analytic software. This volumetric approach to MDCT interpretation has been shown to improve diagnostic accuracy, confidence, and speed.
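To make the read-thick idea concrete, here is a minimal sketch of thick-slab rendering, assuming a thin-section volume held as a NumPy array; the slice counts, thicknesses, and function name are illustrative, not any vendor's implementation.

```python
import numpy as np

def render_thick_slabs(thin_volume, slices_per_slab=8, mode="mip"):
    """Collapse groups of thin axial slices into thick slabs.

    thin_volume: array shaped (slices, rows, cols); for example,
    0.625 mm sections combined 8 at a time yield 5 mm slabs.
    """
    n = (thin_volume.shape[0] // slices_per_slab) * slices_per_slab
    groups = thin_volume[:n].reshape(-1, slices_per_slab,
                                     *thin_volume.shape[1:])
    if mode == "mip":                        # maximum-intensity projection
        return groups.max(axis=1)
    return groups.mean(axis=1)               # average-intensity slab

# 80 thin sections become 10 thick slabs; values mimic Hounsfield units.
thin = np.random.randint(-1000, 3000, size=(80, 64, 64), dtype=np.int16)
print(render_thick_slabs(thin).shape)        # (10, 64, 64)
```

Multiplanar reconstructions work the same way along other axes. Note the asymmetry: thick slabs can always be regenerated from stored thin data, but thin data cannot be recovered from the slabs, which is the crux of the storage debate.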
To address this need, PACS vendors have recently started to integrate 3D tools fully into their standard display protocols. Most radiologists, however, are still forced to divide their attention between their PACS workstations and specialized 3D software. This division has reinforced a natural separation between the temporary storage of thin axial data on a 3D workstation or render server and the long-term storage of thick rendered data and selected 3D images on PACS.
Is this the best model for data storage? As we near the end of the first decade of the 21st century, we need to reexamine this concept in light of steadily improving storage technology and an improved understanding of the utility of thin-section data.
Stratifying the Issues
In order to decide which storage schema best suits your practice, it is important to look beyond the terabytes and gigabits to the clinical factors that can influence your decision. First, although most 3D and PACS vendors (and many radiologists) view 3D software as a single entity, when considering the storage of thin data it is best to divide advanced visualization into two categories: 3D clinical applications (3DCA) and the 3D tool set (3DT). 3D clinical applications are specialized clinical programs that allow for the interpretation and analysis of specific types of studies, such as cardiac CT, virtual colonoscopy, and brain-perfusion studies. These represent a small (but growing) fraction of most practices’ workloads, are typically interpreted by a subset of radiologists, and rely on software that is technologically immature (that is, rapidly improving).
In contrast, 3DTs are the basic volumetric tools, such as maximum-intensity projections, multiplanar reconstructions, and curved planar reconstructions. Technologists and radiologists use these tools every day to create the thick-slab datasets that are stored on PACS and used to interpret MDCT studies; directly or indirectly, every radiologist who reads MDCT uses a 3DT, and the technology is mature. Whether a 3DT is available on your radiologists’ desktops as client-server software or is integrated into your hanging protocols will be a principal driver of your storage decisions.
Second, in this decade, referring physicians have divided naturally into two groups: report-focused and image-focused clinicians. Report-focused clinicians rely, in their day-to-day practice, on the radiology report, supplemented by access to key images and thick-section data. Image-focused clinicians, while still using the radiologist’s report, frequently need access to the original thin data, along with the 3D software used to re-render it; examples include orthopedic surgeons, neurosurgeons, and cardiologists. Whether you can give these clinicians access to thin data and re-rendering tools will be a second principal driver of your storage decisions.
Third, we must recognize the medicolegal aspects of image storage. Developed in the era of hard-copy imaging and paper medical records, the concept of saving all the images used by the radiologist to interpret the study seems confusing, at best, in our emerging thin-section world. Two examples illustrate why this standard is so troublesome. Practice A sends representative thick-section axial data to PACS and thin data to a temporary archive attached to 3D workstations. A radiologist uses the thin data, volumetrically rendered on a 3D workstation, to make the diagnosis, saving only a few key images to PACS.
Practice B sends only thin data to an advanced PACS, where a radiologist reviews the study using a hanging protocol with rendered thick-slab images, saving a few key images and the original thin data set. In either example, have we met the standard of actually saving the data used to make the diagnosis?
Even more troubling is whether the 3D or PACS vendor could accurately recall which software, and which version of that software, the interpreting radiologist used. While the ultimate decision on the medicolegal requirements for image storage in the thin-slice world is still outstanding, it behooves us to address its potential to affect our storage decisions proactively, in consultation with our hospitals’ legal departments.
Hitting the Reality Wall
As if these clinical factors were not enough, we also need to deal with the gap between popular perceptions of technological progress and the realities of radiology informatics. Read any newspaper and you will learn that modern computer systems and networks are becoming faster, storage is becoming less expensive, and cloud computing offers tantalizing potential; yet the effects of these technologies on medical informatics are unclear. In this era of even more restricted resources, the costs of upgrading hospital networks and specialized computer systems (such as PACS) represent a significant barrier to new technology. In light of these uncertainties, it is helpful to examine some of the technology myths of PACS storage.
Myth 1: Spinning-disk storage is cheap. Although the price of purchasing spinning-disk storage continues to fall, these savings are being overwhelmed by the mounting lifecycle costs of maintaining large storage systems. These include electricity, cooling, space, maintenance personnel, and the need to replace hardware every 5 to 10 years, due both to technological obsolescence and to reliability problems.
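A back-of-envelope lifecycle calculation shows how these costs swamp the sticker price; every figure below is a hypothetical placeholder, not a quoted price.

```python
# Lifecycle cost per usable terabyte (all figures hypothetical)
purchase_per_tb = 300.0           # acquisition cost, $/TB
power_cooling_per_tb_year = 60.0  # electricity plus cooling, $/TB/year
admin_per_tb_year = 40.0          # personnel, space, maintenance, $/TB/year
refresh_years = 7                 # hardware replaced for obsolescence/reliability
horizon_years = 14                # retention horizon (two refresh cycles)

replacements = horizon_years // refresh_years           # 2 purchases
lifecycle = (purchase_per_tb * replacements
             + (power_cooling_per_tb_year + admin_per_tb_year) * horizon_years)
print(f"${lifecycle:,.0f} per TB over {horizon_years} years "
      f"vs ${purchase_per_tb:,.0f} sticker price")
# -> $2,000 per TB over 14 years vs $300 sticker price
```

Even under these generic assumptions, the recurring costs dominate the purchase price several times over.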
Myth 2: Intelligent PACS storage exists. Intelligent storage implies purging unnecessary data and retaining only the data that you really need. Think for a minute about your own desktop or laptop. When was the last time you tried to clean up your files? How many copies of similar documents did you discover? Could you design a program that could automatically remove only those extraneous documents? A similar problem exists for PACS: Performing an intelligent purge requires customizable rules and an awareness of issues such as patients’ deaths and pending lawsuits.
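As a sketch of what such customizable rules might look like, the check below encodes the holds mentioned above (patients’ deaths, pending lawsuits) plus a follow-up flag; all field names and the retention threshold are hypothetical.

```python
from datetime import date, timedelta

def eligible_for_purge(study, today=None):
    """Rule-based purge check for a thin-section study record.

    `study` is a dict with hypothetical fields; a real system would
    pull these from the RIS/HIS and legal-hold databases.
    """
    today = today or date.today()
    if study.get("legal_hold"):            # pending lawsuit: never purge
        return False
    if study.get("patient_deceased"):      # deaths trigger retention rules
        return False
    if study.get("flagged_for_followup"):  # e.g., serial nodule measurement
        return False
    age = today - study["acquired"]
    return age > timedelta(days=365)       # site-configurable retention

study = {"acquired": date(2007, 3, 1), "legal_hold": False,
         "patient_deceased": False, "flagged_for_followup": True}
print(eligible_for_purge(study, today=date(2009, 1, 1)))  # False: follow-up hold
```

The hard part is not the code; it is keeping the hold flags accurate across hospital systems, which is exactly why truly intelligent purging remains elusive.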
Myth 3: Moore’s Law will save me. While Moore’s Law continues to lead to faster computers, can we expect this to affect our PACS? Again, look to your home computer to understand why progressive enhancements in computer technology are not rapidly implemented in PACS. Are you still using a four-year-old home computer? In addition to the cost of a new computer, you are probably factoring in the time and effort involved in moving your data and software to a new computer. Now, imagine having to deal with this on an enterprise basis (particularly given the effect of the current economy on health care expenditures), and you can quickly see why new hardware isn’t the ready-made solution to the challenge of PACS storage.
Myth 4: Compression will fix everything. Compression clearly speeds the transmission of images by reducing the file size, but every compressed file needs to be decompressed before viewing. While this is not a terrible penalty when decompressing thick sections, if you store compressed thin data, you now need to decompress multiple thin sections in order to render each thick section. This creates a bottleneck at the server level that becomes more severe when you use newer algorithms, such as interframe compression, that require even more thin slices to be decompressed to render each thick image.
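The arithmetic behind this bottleneck is simple; the decode time and group size below are assumed values for illustration.

```python
# Hypothetical but representative numbers for the decompression penalty
thin_mm, thick_mm = 0.625, 5.0
decode_ms_per_slice = 15                          # assumed per-slice decode time

per_slab_independent = int(thick_mm / thin_mm)    # 8 thin slices per slab
print(per_slab_independent * decode_ms_per_slice) # 120 ms per thick slab

# Interframe compression stores many slices as differences from reference
# frames, so decoding one slice can force decoding its whole group;
# assume a group size of 16:
gop = 16
worst_case = max(per_slab_independent, gop)
print(worst_case * decode_ms_per_slice)           # 240 ms per thick slab
```

Multiply that per-slab cost by every slab in every series a radiologist scrolls through, and the server-side load becomes apparent.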
Myth 5: Streaming video will solve my network problems. Some vendors are using streaming-video–like techniques or ultrathin clients to eliminate the need to transmit large quantities of thin data. All processing is performed centrally on servers, allowing radiologists to choose their storage schema without concern for data-transmission issues. While this is highly promising over the large bandwidth found in hospitals, some caution is needed in settings where bandwidth is reduced and latency issues arise, such as smaller medical offices and the home. Just recall what happens when you try to watch a very popular video on YouTube, or ask your children about frag lag, the popular term for display delay in Internet gaming. If even companies with enormous Internet-server resources cannot eliminate these delays, this approach to medical imaging deserves careful scrutiny by the radiologist consumer.
Myth 6: Faster servers will fix everything. In addition to invoking the issues described under Moore’s Law, this solution has an added challenge: compression and video streaming both place additional peak loading on servers. Will your enterprise install and maintain sufficient server power to handle the time, usually 4 pm on a Friday, when every radiologist and many clinicians want immediate access to patient images?
Myth 7: Clinicians will be satisfied viewing key images. Report-focused clinicians may well be satisfied with key images that highlight the specific pathology described in a radiology report. Two problems exist with this scheme. First, it requires that radiologists assiduously mark key images when they review any study that involves a large number of images. While this is a wonderful way to communicate with report-focused clinicians, my own unpublished survey of 15 academic and community practices demonstrated that barely 50% of cross-sectional studies were indexed with key images. Second, key images will not satisfy image-focused clinicians’ need to view a substantial portion of the entire data set.
Managing the Present
How should we handle this transitional IT world in which we reside? We can see a future in which there are sufficient resources in bandwidth, computing power, and software to handle the tsunami of thin-section and rendered images. At present, however, significant barriers to comprehensive storage remain, especially outside the academic setting.
Which storage model is best for your practice? A single large archive that includes thin data improves data management and follow-up, but can bog down your PACS (Figure 1). Using a separate thin-section archive, typically linked to the 3D software, creates challenges for data management and follow-up care (Figure 2). Look first at your 3D workflow. If your PACS allows you to use 3D tools within routine hanging protocols, then you should develop a single PACS archive that includes thin data to maximize the potential of this technology. If 3D is available to you only outside of PACS, via client-server software or on a stand-alone workstation, then sending your thin data to an archive attached to your 3D system makes sense.
If you do choose a separate, temporary archive for your thin-section data, be aware of those patients who will benefit from long-term storage of their thin-section data. One example would be a patient with a small pulmonary nodule who will require serial follow-up; choices concerning biopsy or surgery will depend on accurate assessment of subtle nodule growth between exams. Selectively sending such patients’ thin data to a long-term archive improves follow-up care without negatively affecting most PACS archives, and it also allows radiologists to retain thin data for research or teaching. This blended storage model is a good compromise for most practices, as sketched below.
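Expressed as a routing rule of the kind a study router might apply as series complete, the blended model could look like this; the field names, thresholds, and destination labels are hypothetical stand-ins for a real system’s configuration.

```python
def route_series(series):
    """Blended storage model: decide where each CT series should go.

    `series` is a dict with hypothetical fields; a real router would
    read DICOM headers and a worklist of flagged patients.
    """
    if series["slice_mm"] >= 2.5:
        return "pacs_long_term"          # thick/rendered data always to PACS
    if series.get("keep_thin"):          # radiologist-flagged: nodule
        return "pacs_long_term"          # follow-up, research, teaching
    return "thin_archive_temporary"      # default: short-term 3D archive

print(route_series({"slice_mm": 0.625, "keep_thin": True}))   # pacs_long_term
print(route_series({"slice_mm": 0.625, "keep_thin": False}))  # thin_archive_temporary
print(route_series({"slice_mm": 5.0}))                        # pacs_long_term
```

The design choice worth noting is that the default is temporary storage; only a deliberate flag promotes thin data to the long-term archive, which keeps archive growth proportional to clinical need.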
As we move into the next decade, technology will continue to improve. Watch for novel thin-section-based rendering and analysis software that can positively affect your patient care. Continue to query your PACS vendors as they improve their integrated 3D functionality and the sophistication of their storage technology. Given the productivity gains for both radiologists and image-focused clinicians that are likely to come from volumetric interpretation on the PACS desktop, this functionality may be a technological advance in which your enterprise is willing to invest. Remember that for your institution, storage extends beyond radiology, so close collaboration with your hospital’s IT group will help you optimize the timing and success of an expansion in your PACS storage scheme.