On the Use of Simulation and Assessment in Surgical Training
Abstract
The transition from medical school to residency is often considered the most difficult year for both teachers and learners. Learners report feeling underprepared, and some researchers have identified a decrease in patient safety during the first month of residency. These factors suggest learners could be better supported during this transition period. Previous research demonstrates that boot camps (BCs) at the onset of residency can improve learners’ confidence, knowledge, and some technical skills. However, little has been published on how those BCs were developed and implemented, why BCs improve only some skills and not others, or the long-term impacts of BC programs.
We used a Context, Input, Process, and Product program evaluation framework to develop, implement, and evaluate a simulation-based BC for novice surgical trainees that was aligned with the recent shift towards competency-based models of medical education. Next, we used a Convergent Parallel Mixed Methods approach to explore the longer-term impacts of the BC program. Lastly, we explored how effectively the Objective Structured Clinical Examination (OSCE), a “gold standard” measure of learner competence that was used in the BC program, truly captures clinical performance of novice trainees.
This work demonstrates that incorporating a BC at the onset of residency can improve residents’ confidence and skill up to two years into training, although adherence to sound pedagogical principles is critical. The BC also provided residents with opportunities for role clarification, acculturation, and social integration. Finally, we demonstrate that OSCEs may not always be the best way to measure BC effectiveness.
The data presented in this thesis will provide educators with new insights into how to create and evaluate successful BC programs that support learners through the transition to residency; highlight new approaches for evaluating educational initiatives; and prompt a conversation about how assessment is used in medical education.