Western Journal of Emergency Medicine (Jan 2017)

Who to Interview? Low Adherence by US Medical Schools to Medical Student Performance Evaluation Format Makes Resident Selection Difficult

  • Osborn, Megan Boysen,
  • Yanuck, Justin,
  • Mattson, James,
  • Toohey, Shannon,
  • Lahham, Shadi,
  • Wray, Alisa,
  • Wiechmann, Warren,
  • Langdorf, Mark

DOI
https://doi.org/10.5811/westjem.2016.10.32233
Journal volume & issue
Vol. 18, no. 1
pp. 50–55

Abstract

The Medical Student Performance Evaluation (MSPE) appendices provide a program director (PD) with comparative data on a student’s academic performance and professional attributes, but they are frequently absent or incomplete. We reviewed MSPEs from applicants to our emergency medicine residency program from 134 of 136 (99%) US allopathic medical schools over two application cycles (2012-13, 2014-15). We determined the degree of compliance with each of the five recommended MSPE appendices. Only three (2%) medical schools were compliant with all five appendices. The medical school information page (MSIP, appendix E) was present most commonly (85%), followed by comparative clerkship performance (appendix B, 82%), overall performance (appendix D, 59%), preclinical performance (appendix A, 57%), and professional attributes (appendix C, 18%). Few schools (7%) provided student-specific, comparative professionalism assessments. Medical schools inconsistently provide graphic, comparative data for their students in the MSPE. Although PDs value evidence of an applicant’s professionalism when selecting residents, medical schools rarely provide such useful, comparative professionalism data in their MSPEs. As PDs seek to evaluate applicants based on academic performance and professionalism rather than standardized testing alone, medical schools must make MSPEs more consistent, objective, and comparative.
