Abstract
BACKGROUND: Competency-based medical education (CBME) is the cornerstone of undergraduate training in Canada. Entrustable professional activities (EPAs) assess competency in professional tasks, with narrative feedback being a key component. There is currently little published literature on the quality of narrative feedback in EPA observations in undergraduate medical education. This study explores the quality of narrative feedback in EPA observations provided to medical students.

METHODS: The quality of narrative feedback in a random sample of anonymised EPA observations from year 3 students was evaluated using the Evaluation of Feedback Captured Tool (EFeCT). The EFeCT assesses five facets of quality feedback, with a maximum score of 5 indicating high-quality feedback. Three evaluators independently assessed the quality of narrative feedback using the EFeCT; any differences in score were resolved through discussion to reach consensus.

RESULTS: In the 2022-2023 academic year, 15,240 EPA observations were completed for year 3 students. A subset of 748 observations was analysed. Of these, one scored 0, seven scored 1, 33 scored 2, 115 scored 3, 151 scored 4 and 441 scored 5 on the EFeCT. The mean EFeCT score was 4.3.

CONCLUSIONS: Overall, the majority of EPA narratives provided moderate- to high-quality feedback to students. However, variability was evident, and many EPA narratives were missing one or more elements of high-quality feedback, which could have significant implications for learner development. Addressing contextual factors such as clinical workload and creating faculty development opportunities may support faculty in providing high-quality narrative feedback.