MedEdPORTAL (May 2007)

A Validated, Behavior-Based Evaluation Instrument for Family Medicine Residents

  • David Fay,
  • Michael Mazzone,
  • Linda Douglas,
  • Bruce Ambuel

DOI
https://doi.org/10.15766/mep_2374-8265.622
Journal volume & issue
Vol. 3

Abstract

Introduction

Resident competency reviews often suffer from subjectivity, ambiguity, and a potential "ceiling" effect for high performers. To address this, we systematically developed, tested, and revised a 32-item assessment tool explicitly linked to the ACGME competencies for use in family medicine residency training programs.

Methods

After a literature review, six experienced faculty created behaviorally anchored rating scales (BARS) corresponding to the six ACGME competencies. For each item in these scales, a resident behavior that would provide exemplary evidence of competence was defined. Two faculty members then developed a five-column, behaviorally anchored scale for each exemplary behavior, with a progression of increasing competence. The second, third, and fourth columns corresponded to the expected ability level of a first-, second-, and third-year resident, respectively; the first column represented behaviors below those expected of a first-year resident, and the fifth column behaviors above those expected of a third-year resident. Unlike other rating scales, this design allows a progressive demonstration of competence across a continuum of behaviors over the entire 3-year program. The instrument was then used for our residency's semi-annual review, accumulating 78 resident evaluations in total. These evaluations were subjected to factor analysis to assess internal validity. After the first three cycles of evaluations, 12 residents who had experienced both the previous semi-annual review form and the new BARS were surveyed to assess their satisfaction with the new instrument and the feedback they received.

Results

The instrument's overall Cronbach's alpha was 0.98, with individual item reliabilities ranging from 0.7585 to 0.9817, demonstrating high reliability. In factor analysis, 21 of 33 items had a factor loading greater than 0.6; only three of the 33 items had a loading on another factor within 0.05 of their primary loading, demonstrating that most items were specific. After the factor analysis, six experienced faculty independently named the factors identified in the analysis. These names correlated well with the six ACGME competencies, demonstrating construct validity. Resident surveys indicated positive perceptions of the new instrument, including receiving specific feedback and help in forming strategies for personal improvement.

Discussion

Using a BARS with expected benchmarks for each level of training has allowed all residents to see the target clearly. The review process has become more objective, with residents and faculty working together to develop individualized goals and objectives that state the next performance level and align with the ACGME competencies. The instrument has also significantly reduced the ceiling effect and has reoriented residents not to expect to be at the top of the scale (residents routinely score in the second, third, and fourth columns). Providing specific behavioral guidelines has given even talented residents an appropriate goal, something that may not be possible with forms that merely indicate that a resident "meets" a competency or is above average. Because the items are behavior based, performance is rated on observation of the associated behaviors (or their absence), not on the rater's internal standard. This has also reduced the "halo effect" that can obscure needs for further development of particular abilities in high-achieving residents. For struggling residents, small increments of improvement have been noted and encouraged, and the next expected level has been unambiguous. Finally, the instrument is user-friendly, requiring no special training of raters or residents.
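The reliability statistic reported in the Results (Cronbach's alpha) can be illustrated with a minimal sketch. The ratings matrix below is synthetic and purely illustrative, not data from the study; it assumes each row is one resident's scores across a few items of a five-column BARS.

```python
# Minimal sketch of Cronbach's alpha: alpha = k/(k-1) * (1 - sum(item variances) / variance of total scores).
# Data are synthetic, for illustration only.
from statistics import pvariance

def cronbach_alpha(ratings):
    """ratings: one row per resident; columns are item scores (e.g., 1-5 BARS columns)."""
    k = len(ratings[0])                                  # number of items
    item_vars = [pvariance(col) for col in zip(*ratings)]
    total_var = pvariance([sum(row) for row in ratings]) # variance of each resident's total score
    return (k / (k - 1)) * (1 - sum(item_vars) / total_var)

# Hypothetical example: 4 residents rated on 3 items.
ratings = [
    [2, 2, 3],
    [3, 3, 3],
    [4, 3, 4],
    [4, 4, 5],
]
alpha = cronbach_alpha(ratings)  # -> 0.9375 for this synthetic matrix
```

Values near 1, like the 0.98 reported for the instrument, indicate that the items consistently rank residents in the same way.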