PeerJ (May 2020)

Assessing the growth in clinical skills using a progress clinical skills examination

  • Heather S. Laird-Fick,
  • Chi Chang,
  • Ling Wang,
  • Carol Parker,
  • Robert Malinowski,
  • Matthew Emery,
  • David J. Solomon

DOI
https://doi.org/10.7717/peerj.9091
Journal volume & issue
Vol. 8
p. e9091

Abstract

Background: This study evaluates the generalizability of an eight-station progress clinical skills examination and assesses growth in performance across six clinical skills domains among first- and second-year medical students over four time points during the academic year.

Methods: We conducted a generalizability study for longitudinal and cross-sectional comparisons and assessed growth in the six clinical skill domains via repeated measures ANOVA over the first and second years of medical school.

Results: The generalizability of the examination domain scores was low but consistent with previous studies of data gathering and communication skills. Variation in case difficulty across administrations of the examination made it difficult to assess longitudinal growth. It was, however, possible to compare students at different training levels and to assess the interaction between training level and growth. Second-year students outperformed first-year students, but first-year students' clinical skills performance grew faster than that of second-year students, narrowing the gap in clinical skills over the students' first year of medical school.

Conclusions: Case specificity limits the ability to assess longitudinal growth in clinical skills through progress testing. Providing students with early clinical skills training and authentic clinical experiences appears to result in rapid growth of clinical skills during the first year of medical school.
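The abstract describes assessing growth via repeated measures ANOVA over four time points. Below is a minimal, hypothetical sketch of that kind of analysis using statsmodels' AnovaRM. The column names, simulated scores, and single within-subject factor are illustrative assumptions, not the authors' data or model; their design also crosses cohort year (first vs. second year), which AnovaRM does not handle and would instead call for a mixed-effects approach.

```python
# Minimal sketch of a repeated-measures ANOVA on simulated skill scores,
# assuming a long-format table with one row per student per time point.
# The data and column names (student, time, score) are illustrative only.
import numpy as np
import pandas as pd
from statsmodels.stats.anova import AnovaRM

rng = np.random.default_rng(0)
n_students, n_times = 30, 4

# Simulate scores that drift upward across the four time points.
records = []
for s in range(n_students):
    baseline = rng.normal(50, 5)
    for t in range(n_times):
        records.append({"student": s, "time": t,
                        "score": baseline + 3 * t + rng.normal(0, 2)})
df = pd.DataFrame(records)

# Within-subject factor: time point. AnovaRM supports only within-subject
# factors, so a between-cohort comparison would need a different model.
result = AnovaRM(data=df, depvar="score", subject="student",
                 within=["time"]).fit()
print(result)
```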

Keywords