Quality Control With A Custom Fit
Less than a year ago, the radiology department at the Fletcher Allen Medical Center (FAMC), Burlington, Vermont, the hospital affiliate of the University of Vermont College of Medicine, was struggling with antiquated peer-review and quality-control (QC) methods. For peer review, according to Steven P. Braff, MD, radiology department chair, radiologists were relying on paper cards that had to be filled out by hand. Busy physicians balked at the added work, even though peer review is mandated by the Joint Commission and called for by the ACR®.
Moreover, Braff says, the FAMC radiology department had no way to formalize quality reviews of radiologic technologists. If radiologists noticed mistakes in a technologist’s imaging, they had to find a supervisor and follow a cumbersome reporting procedure. A similarly burdensome procedure governed mistakes made by radiologists themselves; in that instance, Braff says, the radiologist who spotted a possible interpretation error had to take the awkward step of questioning the work of a fellow physician.

Now these missed-materials cases can be handled electronically by any radiologist who spots what he or she thinks is a mistake. The alleged mistake can be identified in the PACS, and the case can either be handled at once or (if the mistake is noncritical or the case is nonactive) sent for review at a missed-materials conference that the radiology department holds monthly, Braff says. The same software application that allows missed materials to be handled electronically using FAMC’s PACS also generates electronic peer reviews and electronic reviews of technologists, all by using drop-down menus and simple click responses, with an option for written comments.

This electronic quality-review capability has given the radiology department a much better handle on QC, Braff says. It’s fast and easy for radiologists to do peer reviews or respond to errors with a few mouse clicks. He estimates that since installing the QC software on the PACS, the department has quadrupled its volumes of peer reviews, technologist reviews, and missed-materials reports. “If you tell physicians reading cases to pick up a pen and fill out a card, they just are not going to be able to do it nowadays, as busy as physicians are,” Braff says, “but with this, there’s no excuse not to do it. It’s really easy.”

To meet its peer-review and other quality-control needs, FAMC turned to peerVue (Sarasota, Florida), a QC software vendor that also offers consultation and customization. The latter turned out to be handy because FAMC wanted a range of custom features in its electronic QC programs.

FAMC is licensed for 562 beds and has more than 500 faculty physicians, including 35 staff radiologists and an additional 30 radiology residents and fellows. The radiology department completes about 300,000 imaging studies per year, Braff notes. Like many other faculty physicians, Braff holds multiple appointments: in addition to being department chair, he is professor of radiology and professor of neurology and neurological surgery. Braff is a hospital employee, as are the other FAMC radiologists. In addition to reading directly for the hospital, they read for other facilities with which the hospital has contracted. These contracts can cover preliminary and final interpretations, as well as overreads (including night-coverage services). FAMC itself does not use outside night coverage, Braff says.

Peer Review

The peer-review application that FAMC uses is more or less what peerVue offers out of the box; for this software, peerVue charges an enterprise fee. Braff describes the peerVue package as nearly identical to the peer-review categories used by the ACR. “I believe it’s exactly the same; it certainly passes muster,” he says. Of course, the radiology department has to decide how to use the software. One of FAMC’s decisions, Braff says, has been to forgo protocols and, instead, to encourage radiologists to do peer review using a first-case-of-the-day format.
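To picture that first-case-of-the-day workflow, here is a minimal sketch in Python. It is purely illustrative: the article does not describe peerVue’s internals, so every name, data structure, and routing step below is an assumption, with only the response categories taken from Braff’s description of the peer-review window.

```python
# Hypothetical sketch of a first-case-of-the-day peer review.
# Nothing here is peerVue code; all names are invented for illustration.
from dataclasses import dataclass
from typing import Optional

# Response categories mentioned in the article; peerVue's full list may differ.
PEER_REVIEW_CHOICES = {"agree", "diagnosis not ordinarily expected", "missed finding"}

@dataclass
class PeerReview:
    accession_number: str
    reviewer: str
    response: str                  # one of PEER_REVIEW_CHOICES
    comment: Optional[str] = None  # optional written comment

# Submitted reviews accumulate for the chief quality officer in radiology.
quality_officer_queue: list[PeerReview] = []

def review_first_case(worklist: list[str], reviewer: str, response: str,
                      comment: Optional[str] = None) -> PeerReview:
    """Attach a peer review to the first case read off today's worklist."""
    if response not in PEER_REVIEW_CHOICES:
        raise ValueError(f"unknown response: {response!r}")
    first_case = worklist[0]  # the 'random piece off the worklist'
    review = PeerReview(first_case, reviewer, response, comment)
    quality_officer_queue.append(review)  # routed on for analysis
    return review
```

One such call per radiologist per day would reproduce the cadence Braff describes, which he says is more than enough to meet Joint Commission requirements.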
“I like them to review the first case that they read,” Braff says. “Just click on the prior study and drop down the peer review. Look at the old report and see if it’s the same as your opinion. That’s the random piece off the worklist; they don’t know what they’re going to read.”

In the peer-review window, Braff says, the radiologist can click on “agree” or on alternatives such as “diagnosis not ordinarily expected” or “missed finding.” The reviewer can also add written comments. When the review is complete, the reviewer clicks to submit it, and the review is stored electronically. All peer reviews then go to the chief quality officer in radiology; the officer analyzes each case and “reports to me if there is anything that needs to be done,” Braff says. If each radiologist peer reviews one case per day, there are more than enough reviews to meet Joint Commission requirements.

At night, much of the reading at FAMC is done by residents; these cases are always overread the next day by staff radiologists, using a slightly customized electronic format similar to that of peer review. The residents see the results. “We don’t send anything if we agree with them,” Braff says, “but if we don’t, we hit ‘disagree,’ either major or minor.” There can be written comments. The reviewers can also hit “kudos” if they think that a resident has completed superlative work. “We give plenty of kudos as well,” Braff says.

Missed Materials

It’s vital, Braff says, to segregate missed-materials cases from peer reviews, so when a radiologist spots a mistake by a colleague, a different drop-down window is used. If every report of a radiologist’s error turned into a peer review, Braff says, then the peer reviews for any one radiologist would be error-ridden examples unrepresentative of that radiologist’s work. The response choices in the missed-materials window are also different, Braff says. They include “missed finding,” “misinterpretation,” “calling it the wrong thing,” “overcalling it,” “calling it something it really isn’t,” or “kudos,” he says, noting that radiologists rarely pass out kudos among themselves.

Like the peer reviews, the missed-materials reports go to the chief quality officer; unlike the peer reviews, the missed-materials cases are presented at a monthly conference. There is no protocol for finding missed-materials cases; they are reported, Braff says, whenever someone spots an error. “A busy neuroradiologist might read 100 cases a day and spot one or two with major or minor discrepancies in a prior study,” Braff says. In the monthly conferences, he adds, there is usually a missed-materials case or two that warrants follow-up. “That’s in any department,” Braff says. “It would be disingenuous to say otherwise.” In the past, when noncritical mistakes were spotted, they often went unreported, Braff says, but this is no longer the case. As with peer review, the quality officer is always looking for patterns showing that a radiologist might need re-education for repeated mistakes.

Customized QC

From the beginning, FAMC’s radiology department knew that it wanted quality-reporting tools beyond the scope of off-the-shelf applications, Braff says. It is these specialized electronic QC tools, Braff adds, that mark the difference between conventional health care and extraordinary care. One of the first things that Braff set peerVue’s programmers to work on was development of a QC drop-down window for reporting technologists’ errors when radiologists spotted them.
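Before looking at how that window works in practice, here is a sketch of what such a reporting form might capture. It is an illustration under invented names, not peerVue’s design; the modality areas and the diagnostic-radiology checkoff choices come from Braff’s description, and the choices for the other areas are assumptions, since the article does not spell them out.

```python
# Hypothetical sketch of a technologist-review report; not peerVue code.
from dataclasses import dataclass
from typing import Optional

# Modality areas as Braff describes them. Only the diagnostic-radiology
# choices are quoted in the article; the others are placeholders.
TECH_REVIEW_CHOICES = {
    "diagnostic radiology": {"motion", "poor exposure", "poor positioning",
                             "incorrect number of views", "kudos", "other"},
    "angiography": {"kudos", "other"},
    "ultrasound": {"kudos", "other"},
    "MRI/CT": {"kudos", "other"},
}

@dataclass
class TechnologistReview:
    technologist: str        # filled in automatically by the software
    accession_number: str    # likewise pulled from the study under review
    area: str
    finding: str
    comment: Optional[str] = None

def route_to_supervisor(area: str, review: TechnologistReview) -> None:
    # Stand-in for the column-based routing Braff mentions.
    print(f"sent to {area} supervisor: {review}")

def file_technologist_review(technologist: str, accession_number: str,
                             area: str, finding: str,
                             comment: Optional[str] = None) -> TechnologistReview:
    """Record a critique and route it to the area's supervisor."""
    if finding not in TECH_REVIEW_CHOICES.get(area, set()):
        raise ValueError(f"{finding!r} is not a choice for {area!r}")
    if finding == "other" and not comment:
        raise ValueError("'other' requires a filled-in comment")
    review = TechnologistReview(technologist, accession_number,
                                area, finding, comment)
    route_to_supervisor(area, review)  # supervisors look for patterns
    return review
```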
The technologist-review capability is now in use with the PACS, and it is adding consistency and saving time, Braff says. He hopes that it will improve technologists’ performance, although he says that the system is too new to pinpoint results. If nothing else, he says, technologists’ supervisors will be able to spot patterns and take remedial action.

The technologist-review window, Braff says, is divided into areas for angiography, ultrasound, MRI/CT, and diagnostic radiology. The radiologist clicks the appropriate category and pinpoints the error in imaging. “If I open diagnostic radiology,” Braff says, “I can check off ‘motion,’ ‘poor exposure,’ ‘poor positioning,’ ‘incorrect number of views,’ ‘kudos,’ or ‘other.’ If it’s ‘other,’ you just fill in the comment,” he says. The technologist-review software brings up the technologist’s name and the accession number of the study being reviewed. The critique “goes right to the supervisor, depending on the columns we check,” Braff says. The supervisors can then spot consistent errors and take remedial action. “It becomes an educational tool,” Braff says. “It’s great in its simplicity.”

Tagged Incidental Findings

A second custom application in FAMC’s new QC array has Braff equally excited: its goal is to ensure follow-up for incidental findings that might or might not become serious disease. “We all have a way of handling findings of critical values—a hemorrhage of the brain, pulmonary embolism, or aortic-aneurysm rupture,” Braff says, “but what about that chest radiograph where you spot a tiny little nodule you don’t think is too important, but you’re not sure?” Every year, patients die because incidental findings, though reported initially, don’t get follow-up attention, Braff says.

FAMC is putting electronic tags on these incidental findings, so that after a stipulated period of time, the radiologist will get a reminder to take another look at the case. “If you see that nothing’s been done, then get on the horn and call that physician,” Braff says. The contact doesn’t have to be as time consuming as a call; it could also be an automatic fax transmission or an electronic note. “You could set that up any way you want to,” Braff says. If health reform succeeds in bringing millions of newly covered patients on board, radiologists and other physicians are going to be busier than ever, he adds. “There will be a lot more opportunities to forget things or forget findings,” he says. “These electronic reminders are easy information to integrate into the PACS and are easy additions to workflows. I think that they not only have a place, clearly, but are critical. We can’t do our jobs as well without them.”

With its new QC system, FAMC is gathering more data than ever. Eventually, Braff says, there’s sure to be a quantifiable payoff, but the system is not yet a year old, and it’s too early to tell what that payoff will be. “It’s great for education. I already know it makes a difference that way,” he says. “I can’t tell you that it has altered any individual physician’s profile in terms of performance, at this point, but I expect that it will, over time. It’s the stuff we couldn’t do easily a year ago. Now, we can,” he says.
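The tagging mechanism Braff describes reduces to a simple pattern: flag a finding, stipulate a follow-up interval, and surface a reminder once the interval lapses with nothing done. The sketch below illustrates that pattern with invented names; the article does not detail how FAMC’s tags are actually stored or triggered.

```python
# Hypothetical sketch of electronic tags on incidental findings.
# All names are invented; this is the pattern, not FAMC's implementation.
from dataclasses import dataclass
from datetime import date, timedelta

@dataclass
class IncidentalTag:
    accession_number: str
    radiologist: str
    note: str            # e.g., "tiny pulmonary nodule, uncertain"
    follow_up_on: date   # the stipulated re-look date
    resolved: bool = False

tags: list[IncidentalTag] = []

def tag_finding(accession_number: str, radiologist: str,
                note: str, follow_up_days: int) -> IncidentalTag:
    """Tag an incidental finding for a re-look after a stipulated period."""
    tag = IncidentalTag(accession_number, radiologist, note,
                        date.today() + timedelta(days=follow_up_days))
    tags.append(tag)
    return tag

def due_reminders(today: date) -> list[IncidentalTag]:
    """Unresolved tags past their follow-up date: time to get on the horn,
    send the automatic fax, or fire off the electronic note."""
    return [t for t in tags if not t.resolved and t.follow_up_on <= today]
```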
George Wiley is a contributing writer for ImagingBiz.com.