Are electronic QA tools introducing new hazards to the radiation oncology suite?
Swapping traditional paper checklists for digital alternatives could cut the time physicists and dosimetrists spend on quality assurance (QA) in radiation therapy, researchers have reported in Practical Radiation Oncology. But it’s still unclear whether an electronic approach will actually improve patient safety or quality of care.
Part of the recent shift toward safety mindfulness in radiation oncology has been the implementation of several in-house electronic QA tools, first author Gregg S. Tracton, BSE, and colleagues in the department of radiation oncology at the University of North Carolina, Chapel Hill, wrote. Still, the prevailing standard is a paper metrics checklist that a physicist or dosimetrist completes manually before therapy begins.
“Some of the patient safety challenges in [radiation therapy] are claimed to be caused by imperfectly designed and misused software within the health information technology ecosystem,” Tracton et al. said. “Indeed, to some degree the elimination of paper from our workflows has uncapped the potential for interactively complex documentation and communication failures.”
Automated QA systems to date have shown some success in flagging discrepancies and concerns in patient plans before treatment, the authors wrote, but clinicians also recognize that some checks require human cognition and should be completed manually.
“It appears to be commonly accepted that automated QA tools are efficient and effective at eliminating specific clinical faults while reducing the cognitive burden on radiation therapy clinicians to conduct all the work manually,” Tracton and co-authors wrote. “Researchers also acknowledge that it is possible that automated QA tools, if not properly designed to support cognitive workflows of [radiation therapy] clinicians, might actually be introducing new hazards to patient safety and plan quality.”
In an effort to create a digital checklist that supports clinicians while reducing cognitive workload and the time spent on QA tasks, the researchers, working alongside radiation oncologists, dosimetrists, physicists, human factors engineers and a computer scientist, designed a list built around three principles.
The first, patient context, ensured clinicians had access to an overview of the patient’s case, task summaries and potential pitfalls. Because dosimetrists so often multitask, the authors wrote, providing that context helps guard against lapses in policy adherence caused by confusion or interruptions.
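By way of illustration only, here is a minimal sketch of how such a patient-context panel might be represented in software; the PatientContext class and every field name below are hypothetical assumptions, not details taken from the published tool.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class PatientContext:
    """Hypothetical summary a checklist screen might show before QA begins."""
    patient_id: str
    diagnosis: str                 # e.g., "left-sided breast cancer"
    prescription: str              # e.g., "50 Gy in 25 fractions"
    task_summary: List[str] = field(default_factory=list)        # what this QA session covers
    potential_pitfalls: List[str] = field(default_factory=list)  # known trouble spots for this plan type

    def render(self) -> str:
        """Return a plain-text overview for display at the top of the checklist."""
        lines = [f"Patient {self.patient_id}: {self.diagnosis}, {self.prescription}"]
        lines += [f"Task: {t}" for t in self.task_summary]
        lines += [f"Pitfall: {p}" for p in self.potential_pitfalls]
        return "\n".join(lines)

# Example usage with made-up values:
ctx = PatientContext("RT-0001", "left-sided breast cancer", "50 Gy in 25 fractions",
                     ["Verify beam arrangement", "Confirm dose constraints"],
                     ["Heart dose for left-sided targets"])
print(ctx.render())
```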
The second element, a dosimetry check, provided a step-by-step analysis of the patient’s plan: it generated physics data points and automatically ran 34 checks against departmental policy and procedures. The third, cross-checks, helped the clinician reconcile the planning system, the delivery system and practice policy.
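As a rough sketch of what an automated policy check could look like, the snippet below compares a few plan parameters against assumed departmental limits; the parameter names, limit values and check_plan helper are illustrative stand-ins, not the 34 checks actually implemented by the authors.

```python
from typing import Dict, List

# Assumed departmental limits for illustration; real policies vary by clinic and disease site.
DEPARTMENT_POLICY = {
    "max_dose_per_fraction_gy": 2.5,
    "max_total_dose_gy": 70.0,
    "min_target_coverage_pct": 95.0,
}

def check_plan(plan: Dict[str, float]) -> List[str]:
    """Return flagged concerns for the clinician to review.

    The tool only surfaces discrepancies; the clinician still decides
    whether each flag represents a real problem.
    """
    flags = []
    if plan["dose_per_fraction_gy"] > DEPARTMENT_POLICY["max_dose_per_fraction_gy"]:
        flags.append("Dose per fraction exceeds departmental limit")
    if plan["total_dose_gy"] > DEPARTMENT_POLICY["max_total_dose_gy"]:
        flags.append("Total dose exceeds departmental limit")
    if plan["target_coverage_pct"] < DEPARTMENT_POLICY["min_target_coverage_pct"]:
        flags.append("Target coverage below departmental minimum")
    return flags

# Example: a hypothetical plan whose fraction size would be flagged for manual review.
print(check_plan({"dose_per_fraction_gy": 3.0, "total_dose_gy": 60.0, "target_coverage_pct": 97.0}))
```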
When the electronic list was tested with 20 radiation therapy clinicians from three practices, Tracton and colleagues found that the 10 participants recruited to perform QA with the new list averaged 22 minutes to complete the task, compared with 34 minutes for the other half of the group using the conventional paper checklist, a reduction of roughly one-third.
The electronic-checklist cohort also documented more additional patient safety and plan quality concerns, according to the report. Implementing the electronic list, however, did not appear to significantly improve recognition of errors purposefully embedded in the data, nor perceptions of workload. For that reason, the authors said, more research on the subject is warranted.
“Results show that our electronic checklist improved recognition and documentation of additional patient safety and plan quality concerns,” Tracton et al. wrote. “This suggests that our electronic checklists may have engaged participants in higher levels of safety mindfulness, a desired state of detecting and reporting more potential failures.”