4-hour simulated exam successfully evaluates radiology residents on emergency work
An online simulator built to assess radiology residents during after-hours emergency room (ER) work has successfully evaluated dozens of postgraduate trainees on their ER aptitude, a team reported in Academic Radiology this month.
First author Ivan R. Diamond, MD, PhD, and colleagues said in the journal that residency programs typically rely on objective structured clinical examinations (OSCEs), mock oral exams and multiple-choice in-service exams to evaluate resident competency, all of which tend to depend on limited static images outside the clinical setting.
“The results from these assessments may not fully capture the resident’s competence in the clinical domain,” Diamond, a radiologist at Trillium Health Partners in Mississauga, Ontario, Canada, and a lecturer in the Department of Medical Imaging at the University of Toronto, wrote. “With the transition to competency-based residency education, it will be increasingly important for training programs to have means to assess the resident’s clinical competence in a manner that captures their real-world performance.”
Rather than fall back on traditional testing methods, Diamond and his team developed an emergency radiology simulator for their study using Articulate Storyline, an interactive e-learning course development platform. The simulation was designed to run for nearly four hours, followed by a 15-minute handover activity. Residents started the program with a protocolling exercise, in which they answered 16 multiple-choice questions about imaging scenarios and protocols, before moving on to a reporting segment.
“The simulator provided a realistic queue of typical diagnostic radiology studies that would be present in an emergency room setting,” the authors wrote. “In addition to focusing on a more valid assessment of performance in interpreting emergency radiology studies, the simulator aimed to replicate the various activities required for radiology residents in an emergency environment.”
So, instead of simply diagnosing a case correctly based on an image or two, the simulator pushed residents to detect abnormalities within a larger volume of images and to triage those findings with a referring physician, Diamond et al. said.
Non-medical expert CanMEDS competencies, such as communicator, leader and health advocate, were assessed through randomized interruptions, protocolling, prioritization of cases and management of radiation exposure, according to the researchers. Case interpretation exercises were geared toward fulfilling the "medical expert" CanMEDS role.
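The article does not describe the simulator's internals, and the team built it in Articulate Storyline rather than in code. Purely as an illustration of the workflow described above, the hypothetical Python sketch below models a prioritized case queue punctuated by randomized interruptions; every class, field and message here is invented for this example and is not the authors' implementation.

```python
import random
from dataclasses import dataclass

@dataclass
class Case:
    """One study in a hypothetical simulated emergency worklist."""
    study_id: int
    modality: str  # e.g., "CT" or "radiograph"
    acuity: int    # lower number = more urgent

@dataclass
class SimulatedShift:
    """Sketch of a simulated after-hours shift: a queue of cases
    worked in order of urgency, with random interruptions standing in
    for the simulator's non-medical-expert tasks (communication,
    protocolling, prioritization)."""
    cases: list
    interruptions: list
    interruption_chance: float = 0.2

    def run(self):
        # Work through cases by acuity, as a resident triaging
        # an emergency worklist would.
        for case in sorted(self.cases, key=lambda c: c.acuity):
            if self.interruptions and random.random() < self.interruption_chance:
                print("Interruption:", random.choice(self.interruptions))
            print(f"Reporting study {case.study_id} ({case.modality})")

shift = SimulatedShift(
    cases=[Case(1, "CT", acuity=2), Case(2, "radiograph", acuity=1)],
    interruptions=[
        "Referring physician calls for preliminary results",
        "Protocolling question for a newly ordered CT",
    ],
)
shift.run()
```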
Forty-eight residents participated in the first simulation, the authors said, and their results varied widely. Those newest to radiology performed the worst.
“Postgraduate year (PGY)-2 residents performed worse on the medical expert domains, although performance did not significantly vary between the other years,” the researchers wrote. “This may suggest that competence in emergency radiology is achieved early in residency, possibly related to the importance placed on developing skills related to on-call performance during the PGY-2 year.”
The researchers said their simulator can be adapted, and needs to be, for other subspecialties of radiology. They said their next venture would center on multi-specialty oncologic imaging.
“Our goal is to assess the utility of the simulator for assessing resident performance across various realms of radiology,” Diamond and colleagues said. “It is essential to demonstrate that simulation is valid across a broad range of areas in order for simulation to become a key component of our assessment metrics as we transition to a competency-based residency program.”