Dashboards: From Data to Discovery

Paul J. Chang, MD, FSIIM, says, “Because of the external expectations that we will all do more in radiology with less time and fewer resources, we are now entering a maturation phase that I call image management. The emphasis, now, is on understanding what we do to help the value proposition. The key is now measurable improvement in efficiency, productivity, or whatever key performance indicators create value for your institution.” Chang, an abdominal radiologist, is professor and vice chair of radiology informatics and medical director of pathology informatics at the University of Chicago School of Medicine and is medical director of enterprise imaging and of service-oriented architecture at the University of Chicago Hospitals in Illinois. He presented “Technical Aspects: Developing and Deploying a Dashboard” as part of the multisession course “Quality Improvement: Quality and Productivity Dashboards” on November 29, 2011, at the annual meeting of the RSNA in Chicago.

The PACS, RIS, and electronic medical record (EMR) were originally designed to let radiology departments do their work, not assess their work, and that is still their primary function. For this reason, Chang says, there will usually need to be a separate business-intelligence entity (Figure 1), deployed on a service-oriented architecture and constructed to bring together the information that a dashboard will then show.

The adoption of standards for information systems can only be seen as a huge improvement over the proprietary protocols that came before them (and made communication between systems profoundly difficult, absurdly expensive, or simply impossible). There are still communication problems among information systems today, however—along with far better ways of overcoming them.

Watching the Battle

The hospital information system (HIS) typically uses the HL7 communication standard, for instance, while the PACS is likely to support at least one form of the DICOM standard. Once the EMR, billing/financial systems, any relevant stand-alone pathology and laboratory systems, and the RIS are added—as they must be, to obtain a comprehensive picture of the department’s operations—it is easy to see why manual data aggregation became the norm in many organizations, if they tried to bring together their data at all. Even today, guessing (whether educated or less so) based on the output of the RIS alone is not an uncommon management method in radiology.

Of course, manual data-aggregation methods produced information of relatively low utility (at high cost), and never in real time. Business-intelligence systems replace manual aggregation by pulling relevant data from all available sources in the organization. “The critical, absolutely essential tool for us to have to navigate through this environment is business intelligence/analytics,” Chang says. “Health care is about 10 years behind every other industry in IT, but it’s closer to 15 years behind in business intelligence.”
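A minimal sketch can make the integration problem concrete. The snippet below is purely illustrative (the message content, field positions, and identifier formats are invented, and no vendor’s actual interface is implied): it extracts a medical record number from the PID segment of an HL7 v2 message and reconciles it, after normalization, with the PatientID of a DICOM study, the kind of matching a business-intelligence layer must perform before HIS and PACS data can be merged.

```python
# Hypothetical sketch: reconciling a patient identifier between an HL7 v2
# message (HIS) and a DICOM header (PACS). Message content, field
# positions, and identifier formats are invented for illustration.

def mrn_from_hl7(message: str) -> str:
    """Extract the patient identifier (PID-3) from a pipe-delimited HL7 v2 message."""
    for segment in message.strip().split("\r"):  # segments end in carriage returns
        fields = segment.split("|")
        if fields[0] == "PID":
            return fields[3].split("^")[0]  # first component of the identifier list
    raise ValueError("no PID segment found")

def normalize_id(raw: str) -> str:
    """Strip punctuation and leading zeros so identifiers from different systems compare."""
    return raw.replace("-", "").strip().lstrip("0")

# A toy HL7 ADT message, and matching attributes as they might appear in a
# DICOM header (shown as a plain dict to keep the sketch library-free).
hl7_msg = "MSH|^~\\&|HIS|HOSP|RIS|RAD|202301011200||ADT^A01|42|P|2.3\rPID|1||00123-456||DOE^JANE"
dicom_attrs = {"PatientID": "123456", "PatientName": "DOE^JANE"}

if normalize_id(mrn_from_hl7(hl7_msg)) == normalize_id(dicom_attrs["PatientID"]):
    print("identifiers match; HIS and PACS records can be merged")
```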
Chang adds, “It is useful to distinguish dashboards from scorecards,” despite the fact that the two are the forms of business intelligence/analytics most likely to be seen in health-care settings. Although both scorecards and dashboards can be built using similar data from the organization’s information systems, Chang explains, “The dashboard is a performance-monitoring tool similar to a pilot’s heads-up display: It’s a tactical, real-time, operational tool typically achieved using graphics, charts, and gauges.” The dashboard, he adds, provides tactical situational awareness. It tells users whether they are winning the battle; scorecards (or report cards), in contrast, are strategic, rather than tactical. They tell users whether they are winning the war.

The business-intelligence layer is necessary because it is dangerous, Chang explains, to consume naked data. Without integration and analysis, data sourced from a single information system might not be relevant because it is not comparable to data acquired, handled, defined, changed, or retrieved (according to different criteria) from another system. The solutions to this problem of comparability and compatibility are data standardization and normalization.

Winning the War

The data normalization performed using business-intelligence systems makes large tables into smaller entities in which an added, changed, or deleted field is less likely to affect other areas adversely and is more likely to maintain its actual meaning as it propagates through the information system.
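A small sketch may help make the normalization idea concrete. In the hypothetical example below (all table and column names are invented), a wide RIS-style export that repeats the referring physician’s details on every exam row is split into smaller related tables, so that a single correction propagates everywhere the physician is referenced.

```python
# Hypothetical normalization sketch with invented table/column names:
# a wide, denormalized exam export is split into related tables.
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
-- Denormalized export: referring-physician details repeat on every row.
CREATE TABLE exams_raw (
    accession TEXT, exam_code TEXT, finalized TEXT,
    referrer_id TEXT, referrer_name TEXT, referrer_dept TEXT
);
-- Normalized form: each fact lives in exactly one place.
CREATE TABLE referrers (referrer_id TEXT PRIMARY KEY,
                        referrer_name TEXT, referrer_dept TEXT);
CREATE TABLE exams (accession TEXT PRIMARY KEY, exam_code TEXT,
                    finalized TEXT,
                    referrer_id TEXT REFERENCES referrers(referrer_id));
""")

con.executemany(
    "INSERT INTO exams_raw VALUES (?,?,?,?,?,?)",
    [("A1", "CT-ABD", "2023-01-01", "R9", "Smith, J", "Medicine"),
     ("A2", "MR-HEAD", "2023-01-02", "R9", "Smith, J", "Medicine")])

# Populate the normalized tables from the raw export.
con.execute("""INSERT INTO referrers
               SELECT DISTINCT referrer_id, referrer_name, referrer_dept
               FROM exams_raw""")
con.execute("""INSERT INTO exams
               SELECT accession, exam_code, finalized, referrer_id
               FROM exams_raw""")

# A name correction is now a single-row update, not one update per exam.
con.execute("UPDATE referrers SET referrer_name = 'Smith, Jane' WHERE referrer_id = 'R9'")
for row in con.execute("""SELECT e.accession, r.referrer_name
                          FROM exams e JOIN referrers r USING (referrer_id)"""):
    print(row)
```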

Figure 1. Moving away from human-integrated workflow (top) by using HL7 and DICOM protocols to integrate RIS and PACS (middle) and by deploying service-oriented architecture throughout the enterprise (bottom); image adapted with permission of Paul J. Chang, MD.

Relationships among various kinds of data are also more tightly defined than they would be in a single information system before normalization. “Normalize the data so that you can unambiguously use and trust the information,” Chang says.

The development and implementation of a useful dashboard in radiology should not focus on the final appearance of the dashboard itself, Chang says, although that is the misguided emphasis of many dashboard projects and their developers. The appearance of the tool is far less important than the reliability of the underlying data and the agility of the architecture supporting the dashboard. As Chang puts it, garbage in, garbage out is still a sequence that imperils information systems. This is true even (or perhaps especially) when the risk is one of garbage in, pretty garbage out, as it is for dashboards. An attractive dashboard with a user-friendly interface might still have only a tenuous relationship with the reality of the radiology department if the methods used to generate that dashboard are unsound.

Bringing together all of the available information in a reliable way, however, constitutes leverage for the radiology department in exactly the same way that making full use of equipment and staff expertise does. Make the most of what you already have by using it as well as you can. Chang notes, however, that in using what you have as well as you can, time is a factor. It is important to favor today’s action over tomorrow’s possible perfection. He says, “The biggest risk you have right now is delay—saying ‘We have to do an analysis’ or ‘Let’s think about this.’ Leverage your local resources, as well as external resources (as necessary).” The real risk in business intelligence/analytics, he says, is not having it at all.

Choosing Performance Indicators

C. Daniel Johnson, MD, is chair of the department of radiology and professor of radiology in the College of Medicine at the Mayo Clinic Arizona in Scottsdale. In addition to moderating the three-session course, he presented the second section, “Key Performance Indicators That Drive Quality.”

Figure 2. The Mayo Clinic approach to safety and quality assesses critical points (red) where imaging interacts with the patient pathway via referring physicians, radiology departments, and radiologists; image adapted with permission of C. Daniel Johnson, MD.

Dashboards are built from the bottom up, so the key performance indicators on which they stand must be both relevant to the department’s operation and feasible to construct using data available from existing information systems (the RIS, PACS, HIS, EMR, and departmental and billing/financial systems). For those reasons, Johnson says, the possible key performance indicators used in dashboards should be narrowed down to those that are most likely to be useful in practice; what it’s nice to know is not the same as what it’s necessary to know. “We really don’t have standardized ways of defining quality,” he says. “In fact, the very best quality measures are probably customized.” Established performance measures in health care can serve as a starting point in the creation of dashboards, however—although they will need to be fleshed out in ways that reflect the actual situation of the organization. These measures are used to predict future performance, but they are also the basis for tracking daily activities (and whether those activities represent progress toward the long-term objectives of the organization).

Because the administrative/financial areas of many institutions have been using analytical methods longer than most clinical departments have used them, it might seem easier to begin with billing/financial indicators in setting up business intelligence for radiology. According to Johnson, however, those indicators should certainly not be given priority when the importance of various measures is assessed; in fact, a separate financial dashboard is used at the Mayo Clinic to keep quality and finances apart. Johnson says, “We underwent an analysis by the Baldrige Performance Excellence Program in the radiology department at Mayo Clinic Rochester in Minnesota several years ago. We were astonished to learn that the best metrics we had were financial, and what was really important in creating great patient care was not measured very well.” To manage the quality of work, he adds, departments need measurements that are at least as rigorous as those used for financial attributes.

“Begin by defining some key process measures for quality. Safety is probably the easiest one to begin with; efficiency is always something that people are interested in because it reflects dollars saved. Satisfaction isn’t too hard, but professional outcomes are the most difficult to measure well,” Johnson says. The definition of quality is not static, and it needs to reflect the characteristics of the department—as well as those of its customers and its broader environment, both now and as those characteristics change, Johnson notes. Quality and safety can be seen, in a sense, as the points where care providers meet patients; the Mayo Clinic approach to imaging’s interaction with the patient pathway (Figure 2), as it affects safety and the quality of care, was first illustrated and described by Swensen and Johnson1 in 2005.

Among many other possibilities, some of the main measures outside the financial arena that Johnson names as examples of helpful indicators are exam access and finalization times (in the efficiency category); waiting times and survey results (in the customer-satisfaction category); and interpretation accuracy, complication rates, and order appropriateness (in the professional-outcomes category).
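As a purely illustrative sketch of how such a triage might be recorded (the indicator names, categories, sources, and targets below are invented, not Mayo Clinic measures), each candidate indicator can carry its category, data source, unit, target, and a flag separating what is necessary to know from what is merely nice to know.

```python
# Hypothetical sketch of cataloging candidate indicators before dashboard
# work begins; names, categories, sources, and targets are invented.
from dataclasses import dataclass

@dataclass
class KPI:
    name: str
    category: str      # e.g., safety, efficiency, satisfaction, outcomes
    source: str        # the system the data actually come from
    unit: str
    target: float
    necessary: bool    # "necessary to know," not merely "nice to know"

candidates = [
    KPI("Report finalization time", "efficiency", "RIS", "hours", 4.0, True),
    KPI("Contrast-induced nephropathy", "safety", "EMR", "% of exams", 0.5, True),
    KPI("Lobby artwork rating", "satisfaction", "survey", "1-5 scale", 4.0, False),
]

# Johnson's triage: keep only what is necessary and feasible to collect.
for k in (k for k in candidates if k.necessary):
    print(f"{k.category:12s} {k.name} (target {k.target} {k.unit}, from {k.source})")
```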
The safety category is likely to be among the largest shown on a dashboard, with common indicators including harmful falls, medication errors, sentinel events, critical tests/results, adherence to universal precautions, infection rates, hand hygiene, contrast-induced nephropathy incidence, and mislabeling of images. Johnson says, “Be as specific as you can with all these measures because the more specific you are, the more directly they will translate into improvement.”

In determining patient satisfaction, Johnson says, it is especially important to understand that patients’ needs and expectations will vary widely. The survey responses of oncology patients can be expected to differ from those of breast-imaging patients or the parents of pediatric patients, so survey data will have more meaning if the individual respondent is first asked to define what is important in his or her choice of an imaging facility. Then, that person can be surveyed as to whether those personally important criteria are being met—not on unrelated matters, Johnson reports.

Another type of customer, he continues, is the referring physician. No matter what preferences this physician might have concerning consultation, interpretation accuracy, or report-turnaround times, it is easy to assess whether those needs are being met with just one question. That question, Johnson says, is “How likely are you to refer a family member or close friend to our institution for radiologic care?” While the answer picks up none of what might be relevant detail, it captures the core of any definition of referrer satisfaction.

It will not, however, automatically become part of the business-intelligence database. Johnson says, “If you don’t have access to world-class IT support, this process certainly requires a lot of manual work.” Many types of data used in the Mayo Clinic scorecard/dashboard are still entered manually because they are not routinely collected by the RIS, PACS, HIS, or EMR and cannot be integrated automatically by a business-intelligence system. Survey results, Johnson says, are an example of measurements calling for manual data entry. Other manually compiled indicators might come from external databases (such as national disease registries) for which electronic access is available, but from which information is not always distributed electronically.

In constructing a scorecard/dashboard, it is important to remember, Johnson says, “Metrics are only one piece of the puzzle. A high-performing radiology department has to think about all areas and aspects of good business practice, of course, but metrics should allow us to do the most important thing better and more efficiently. That most important thing is taking care of patients.”

Roll Up and Drill Down

James M. Thrall, MD, is radiologist-in-chief at the Massachusetts General Hospital in Boston and is Juan M. Taveras professor of radiology at Harvard Medical School. In “Using a Dashboard to Manage a Radiology Department,” his section of the three-part presentation, he emphasizes that access to business intelligence is vital to the sound management of today’s radiology departments. Without that layer to integrate data from the RIS, PACS, HIS, and billing/financial systems, the radiology department remains data rich and information poor, Thrall says.
If a radiology department is trying to base its everyday operational decisions (not to mention its longer-term strategies) on information compiled manually from disparate databases, it is more than inefficient; it is probably ineffective as well—because it is underinformed. Hospitals (and radiology departments), Thrall says, not only have access to mountains of data through their HIS, RIS, PACS, and billing/financial information systems, but expend considerable time and effort collecting these facts. Nonetheless, the resulting collections have not been particularly helpful in the management of radiology departments or their parent institutions because it has been difficult (ranging from cumbersome to impossible) to organize the available data into useful formats.

The RIS and PACS, in particular, do not usually have the report-construction abilities that a radiology department needs in order to keep track of its performance and its day-to-day status. These systems were not designed for reporting purposes, and they cannot readily perform the cross-referencing outside their own databases that is called for in constructing useful analytical material for radiology departments. Thrall says, “Most departments do not have robust business-intelligence tools. They will not come from the HIS, the RIS, or the PACS. They will only come from the construction or adoption of freestanding programs.” This is why business intelligence, including access to scorecards and dashboards, has become so necessary, he adds. By integrating data that are already being collected, business-intelligence systems can turn disparate fact collections into relevant dashboard categories that provide accurate, up-to-date information.
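One hypothetical illustration of the cross-referencing Thrall describes (the record layouts are invented; no actual RIS or billing interface is implied): joining exam timestamps from a RIS extract with payer classes from a billing extract yields a turnaround-time breakdown that neither system reports on its own.

```python
# Hypothetical sketch: joining RIS timestamps with billing records to
# produce a figure neither system reports alone. Layouts are invented.
from datetime import datetime

ris_extract = [  # ordered/finalized timestamps, from the RIS
    {"accession": "A1", "ordered": "2023-01-01 08:00", "finalized": "2023-01-01 10:30"},
    {"accession": "A2", "ordered": "2023-01-01 09:00", "finalized": "2023-01-01 15:00"},
]
billing_extract = [  # payer class per accession, from the billing system
    {"accession": "A1", "payer": "Medicare"},
    {"accession": "A2", "payer": "Commercial"},
]

payer_by_accession = {b["accession"]: b["payer"] for b in billing_extract}

def hours_between(start: str, end: str) -> float:
    fmt = "%Y-%m-%d %H:%M"
    return (datetime.strptime(end, fmt) - datetime.strptime(start, fmt)).total_seconds() / 3600

# Turnaround time by payer class: a cross-system dashboard metric.
by_payer: dict[str, list[float]] = {}
for exam in ris_extract:
    payer = payer_by_accession.get(exam["accession"], "Unknown")
    by_payer.setdefault(payer, []).append(hours_between(exam["ordered"], exam["finalized"]))

for payer, times in by_payer.items():
    print(f"{payer}: mean turnaround {sum(times) / len(times):.1f} h over {len(times)} exams")
```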

Figure 3. Hierarchy of business intelligence; image adapted with permission of James M. Thrall, MD.

Thrall calls manual data compilation the swivel-chair method, since someone must retrieve and print data from one of the organization’s information systems and then turn to another system to enter those data into it before any useful analysis can take place. “We know the benefits of adopting data warehousing and data normalization. If that’s done, then a lot more can be extracted electronically. If that’s not done, then it’s back to printing out data on one system and manually entering data in another,” Thrall says.

The first step in moving beyond the swivel-chair method is to determine what matters most in operating a radiology department. Those key performance indicators, once chosen, then form the level below the dashboard—the foundation of dozens of variables upon which it is built and to which the dashboard’s approximately 20 categories give access when greater detail is desired. Key performance indicators form the level of information that the dashboard user can view by drilling down; the bigger picture, containing four to six overarching topics, is accessed by rolling up from the dashboard (Figure 3). Key performance indicators are the measurements and parameters that tell radiology departments where they stand in terms of financial performance, quality of care, efficiency, and satisfaction (of patients, referring physicians, and other stakeholders within and outside the institution). Thrall says, “We define the key performance indicator, we define the term, and we define who is accountable for it.” Units, metrics, and targets for the indicator are then determined. Key performance indicators are not shown directly on a well-constructed dashboard because they are too numerous (Thrall’s department, for example, began with 155 of them). Instead, the dashboard aggregates the key performance indicators in ways that make sense to users and allow them to drill down or roll up quickly, as needed, to the information that they require to support the decisions that they must make.
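The roll-up and drill-down behavior Thrall describes can be sketched as a simple tree in which leaf nodes carry measured key performance indicators and each higher level summarizes its children. The hierarchy, names, values, and scoring rule below are illustrative assumptions only, not any department’s actual structure.

```python
# Illustrative sketch of a KPI hierarchy supporting roll-up and drill-down;
# the tree, names, values, and scoring rule are invented.
from __future__ import annotations
from dataclasses import dataclass, field

@dataclass
class Node:
    name: str
    value: float | None = None   # leaves carry measured values
    target: float | None = None
    children: list[Node] = field(default_factory=list)

    def score(self) -> float:
        """Roll up: a leaf scores value/target; a branch averages its children."""
        if not self.children:
            return self.value / self.target
        return sum(child.score() for child in self.children) / len(self.children)

    def drill_down(self, depth: int = 0) -> None:
        """Print the tree top-down, one indent level per step of detail."""
        print(f"{'  ' * depth}{self.name}: {self.score():.2f}")
        for child in self.children:
            child.drill_down(depth + 1)

# Strategic area -> dashboard category -> leaf key performance indicators.
quality = Node("Quality of care", children=[
    Node("Report turnaround", children=[
        Node("Routine reads finalized within 4 h (%)", value=88, target=90),
        Node("Stat reads finalized within 1 h (%)", value=95, target=95),
    ]),
    Node("Safety", children=[
        Node("Critical results called within 60 min (%)", value=96, target=100),
    ]),
])

quality.drill_down()  # a chair rolls up to the top line; a manager reads the leaves
```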
“The value of dashboards comes from the accessibility of the data. How many clicks does it take to get the desired data? If it’s more than a couple of clicks, it’s too many,” Thrall says. For example, he adds, a service-line manager would be interested in a level of detail that would not normally require the attention of the department chair. From the same dashboard gauge, therefore, the manager could drill down for access to the key performance indicators, and the department chair could roll up to determine how the current status shown by the gauge affects (and is affected by) the other dashboard categories in the same broad strategic area. Different dashboard users who are viewing referral patterns, for example, might be interested in levels of detail ranging from the overall geographical distribution of patients’ home addresses to the particular communities where patients live to the specific physicians who referred them for imaging exams (or the imaging modalities and diagnoses noted for those referrals).

Thrall stresses, however, that a properly constructed dashboard is useful to the department (and the organization) as a whole, not just to department managers and administrators. Medical personnel tend to respect data-driven, objective approaches to management, and staff members might be more likely to respond positively to change when it is supported by clear and relevant data. “The reason that it’s so powerful is that objective data lead to objective discussions and objective thinking. Without objectivity, many people cannot distinguish between being busy and being productive,” he says. “Everyone thinks that he or she is busy, and a lot of people feel that they are overworked, but when you actually have objective data for them to look at, it changes their attitudes.”

Start Now

Thrall reassures the analytically timid that it is not necessary to reinvent the wheel in building a dashboard, and Johnson emphasizes that customization is not the same as starting from scratch. Both report that the work already done (some of which has been published or presented) by other institutions can be a source of key performance indicators that can then be made specific to one’s own radiology department.

“Both,” Chang concludes, is the best answer to the build-or-buy question. Make full use of the organization’s interface/integration team and of your data sources, as well as of the staff or outsourced service that now creates your management reports. Take advantage of the services of good business-intelligence/analytics consultants if your in-house expertise is incomplete. When it’s time for the last step in building a dashboard—creating the user interface—look at the packages and services available from the dashboard companies already active in the radiology field, Chang adds. External and existing IT resources can be blended to obtain the ideal business-intelligence system—and the dashboard that makes it useful every day.

Kris Kyes, Contributor
