Canadian Medical Education Journal (Nov 2024)
Evaluating facilitator adherence to a newly adopted simulation debriefing framework
Abstract
Background: Post-simulation debriefing is a critical component of the learning process in simulation-based medical education, and multiple frameworks have been established to maximize learning during debriefing through guided reflection. This study developed and applied a rubric to measure facilitator adherence to the newly adopted Promoting Excellence and Reflective Learning in Simulation (PEARLS) debriefing framework, in order to evaluate the efficacy of current faculty development.
Methods: A retrospective review of 187 videos of facilitator-learner debriefings following simulated clinical encounters for medical students was conducted using a structured 13-behavior rubric based on the PEARLS debriefing model. The aggregate results were used to describe common patterns of debriefing and to focus future faculty development efforts.
Results: In total, 187 debriefings facilitated by 32 different facilitators were analyzed. Average scores for each of the 13 PEARLS framework behaviors ranged from 0.04 to 0.971. Seven items had an average of ≥ 0.77, ten averaged > 0.60, and two averaged < 0.20.
Conclusions: Faculty adhered to some behaviors elicited by the PEARLS model more consistently than others. These results suggest that faculty facilitators are more likely to adhere to framework elements that focus on educational behaviors and less likely to adhere to organizational or methodological elements.