The Quest for Quality in Radiology
With Quality Counts as its theme for 2009, the RSNA’s 95th Scientific Assembly and Annual Meeting in Chicago, Illinois, emphasized multiple aspects of quality assurance, control, and improvement. On December 1, several multispeaker sessions focused even more strongly on the practical steps that radiology providers can (and should) take to promote high quality in their operations and in their staff performance. Three of the presenters were particularly generous in sharing their experience and insight.
Jonathan B. Kruskal, MD, PhD
James R. Duncan, MD, PhD
Lucy W. Glenn, MD

Jonathan B. Kruskal, MD, PhD, is chair of the department of radiology and director of quality assurance at Beth Israel Deaconess Medical Center, Boston, Massachusetts, and is a professor of radiology at Harvard Medical School. He presented “Anatomy and Pathophysiology of Errors in Radiology Practice,” stressing that the quality-improvement field is beset by myth in some areas. For example, it is often believed that errors are random occurrences, but they are actually attributable to factors that can be detected and corrected. Likewise, he says, the notion that properly trained professionals rarely commit errors is false.

James R. Duncan, MD, PhD, associate professor of radiology in the division of interventional radiology, Mallinckrodt Institute of Radiology, Washington University School of Medicine, St Louis, Missouri, noted the importance of analyzing and then preventing these professional errors in “Assessing Physician Performance.” This process is vital not only to improving patient care but to enhancing public perception of the value of imaging services. He says that members of the public are “spending an incredible amount of money on health care, and their impression is that they’re not getting their money’s worth.”

The quest to improve quality and reverse that impression should begin, Kruskal notes, with the division of all common types of errors in radiology into two primary categories: process errors and professional errors. Technical flaws, incorrect protocols, organizational and cultural failings, poor workflow patterns, substandard work environments, and mistake-prone procedural habits all fall into the category of process errors. Professional errors encompass everything that a radiologist (or a technologist, nurse, or other staff member) might do incorrectly, either habitually or by making an isolated mistake.

Process Errors

Lucy W.
Glenn, MD, chief of radiology at Virginia Mason Medical Center, Seattle, Washington, presented “Strategies for Minimizing Errors in Diagnostic Imaging.” Her institution has addressed process errors through a system of patient-safety alerts. Each reported alert is assigned a color based on the seriousness of the event. A yellow alert can be managed within the department, for example, but a vice president becomes involved in investigating orange alerts, and the processes involved might need to be stopped during that evaluation. If an alert is considered capable of harming the patient, it is designated red and is investigated within 24 hours by multiple departments.

Kruskal says, “When something goes wrong, it’s easy to identify who made the error, but there are often plenty of other associated factors that contributed to that error.” These are the process-related elements that call for changing the situation, not the people involved, he adds. As an example, he cites the need to eliminate reading-room problems such as intrusive teaching rounds, telephone interruptions, and excessive ambient light before complaining that some radiologists are too easily distracted.

Duncan also supports beginning any quality-improvement program with a thorough understanding of process errors. While there are several competing and complementary proprietary systems in wide use for finding problems and correcting them, he says, “It all boils down to the scientific method. Study the past to gain knowledge, and use that knowledge to influence the future.”

Glenn recommends three strategies: First, for any error that threatens patient safety, do what a factory would call stopping the line; in other words, halt the processes involved until the cause of the problem can be found. Second, see that errors are corrected both immediately and, if possible, where they began. Third, ensure that safety considerations are part of every step within each process.
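The color-coded alert scheme described above is, in effect, a small triage rule. The following Python sketch illustrates one way such a rule could be encoded; the function name, inputs, and handling details are illustrative assumptions based only on the description here, not Virginia Mason’s actual system:

```python
# Illustrative sketch of a color-coded patient-safety alert triage rule.
# Inputs and handling fields are assumptions for illustration only.

def triage_alert(harms_patient: bool, needs_executive_review: bool) -> dict:
    """Map a reported safety event to an alert color and handling plan."""
    if harms_patient:
        # Red: capable of harming the patient; investigated within
        # 24 hours by multiple departments.
        return {"color": "red",
                "handled_by": "multiple departments",
                "deadline_hours": 24,
                "stop_process": True}
    if needs_executive_review:
        # Orange: a vice president joins the investigation, and the
        # processes involved may be halted during the evaluation.
        return {"color": "orange",
                "handled_by": "vice president",
                "deadline_hours": None,
                "stop_process": True}
    # Yellow: can be managed within the department.
    return {"color": "yellow",
            "handled_by": "department",
            "deadline_hours": None,
            "stop_process": False}
```

The point of expressing the scheme this way is that the escalation path is explicit and testable, rather than depending on each staff member’s judgment in the moment.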
It is critically important, she continues, to change each error-prone process in ways that make future mistakes far less likely (or even impossible). By linking the PACS and dictation software properly, for example, the error of dictating a report for the wrong patient can be made impossible. She says, “You want to get from checking for defects to preventing errors.” Kruskal concurs, noting, “For an error to occur, a constellation of contributing factors is usually present, but good detection and analytical processes can minimize the occurrence of subsequent errors.” Even though process errors are attributable primarily to systems, he adds, the psychological effects that they have on radiologists should not be overlooked. “This is often an unanticipated consequence of error, and you need processes in place to address it,” he says.

Professional Errors

Kruskal describes the main types of radiologist/staff problems affecting quality as underreading, complacency, faulty reasoning, lack of knowledge, and poor communication. Of course, these can be seen in many combinations, and even the types of mistakes that an individual radiologist typically makes will change over time. What Glenn calls behavioral drift is common, she says, and can affect anyone’s performance; therefore, vigilance in protecting quality must be constant. Even radiologists’ errors that seem to have no effect are important to note, Kruskal adds, because they are the warning flags that show where improvements are needed. For example, he says, “If you miss a small sclerotic lesion, but it’s of no consequence to the patient, that’s still an error. It’s a symptom of a flawed underlying process.”

Noting that considerable change in dealing with professional errors has already occurred, Glenn asks, “What happens when you find the error, and it’s obviously a human issue? Health care used to be a blame culture.
About the mid-1990s, we changed over to a systems approach, acknowledging human fallibility, but there was no accountability for those individuals who displayed unsafe behavior. Somewhere between the two is where we want to be.”

Because a basic error rate of about 3% is considered typical in all human activities, Glenn adds, perfection is sometimes achievable only after the fact. While prevention of errors is ideal, she says, “Mistakes are inevitable, but reversible. If a system picks up an error before it’s too far along, and the mistake is addressed, you can reach zero defects.”

Duncan uses a three-step assessment of physician performance, with the emphasis always placed on a scientific approach: data collection, the first step, is followed by the analysis of trends. The third step uses the trends found in step two to predict what will happen next and how it can be changed. Duncan says, “Inferences are fragile. That’s why having hard data is so crucial—as they say, you can’t improve what you can’t measure.”

Measuring errors (and ensuring that all of them are detected) first calls for what could, in many settings, be a profound cultural change, Kruskal says. Personnel must agree to report their own errors, along with those of others, before any realistic assessment of professional mistakes can occur; at his hospital, three years passed before this shift was fully implemented. Now, however, staff members not only report but often deal with their own errors. When those mistakes could involve harm to a patient, the patient must also be informed.

The next steps, Glenn reports, should be based on the nature of the error. She says, “Human error occurs; when it’s an honest mistake, you don’t want to punish that person, but there’s also at-risk behavior.” Those who behave recklessly must be subject to penalties, but those displaying at-risk behavior need coaching instead.
In cases of simple human error, it is more appropriate to console the person who made the mistake, she adds. A pattern of error will naturally be treated using a more formal evaluation than a single or infrequently repeated event.

Physician assessment of this kind, Duncan says, is best conducted using four separate models: domain, task, evidence, and decision. The domain model first lists the skills that the radiologist should possess. The task model is then used to determine how those skills can be evaluated, with the evidence model used to decide how that evaluation will be scored. Determining what will be done with (and about) the scores is the function of the decision model.

Finding Quality

In conducting both physician assessments and process evaluations, Glenn stresses, anything less than an ongoing search for perfection is inadequate. For example, she notes that accepting 99.9% reliability in the airline industry would result in two large crashes every week. “That’s why having zero defects has to be the goal,” she says. That goal is best pursued by coupling good systems with good behavior so that errors of both major types (and all subtypes) will be not only detected and corrected, but prevented.

In particular, any hospital that is seeking a radiology group to undertake some or all of its interpretations must extend its review of quality to that group. To support the facility’s quality-improvement agenda, the external group must be fully prepared to address both process errors and professional errors; it should exhibit not only a high level of commitment to quality, but also the established capabilities and systems required to pursue error-free operation. If that group can also help the hospital fine-tune its internal processes to prevent mistakes and improve patient safety, it will serve as a valuable quality booster as well. Open, complete communication and cooperation are vital to maintaining and improving quality on both sides.
Glenn adds that tolerating error is intolerable in the pursuit of quality. She says, “We, as health care workers, have to change our mindset. What the typical organization wants is very few defects. We have to embrace what the patient wants—which is no defects. We have to get to a mindset where we think that perfection is possible and injuries are avoidable.”

As Kruskal says, Gunderman and Burdick summed up the importance of the strategic pursuit of quality in radiology when they wrote, “It is not the occurrence of error that is damning, but the failure to seize on it as an opportunity for improvement.”

Kris Kyes is technical editor of ImagingBiz.com.