Health Professions Education (Sep 2019)

Guidelines for Creating Written Clinical Reasoning Exams: Insight from a Delphi Study

  • Évelyne Cambron-Goulet,
  • Jean-Pierre Dumas,
  • Édith Bergeron,
  • Linda Bergeron,
  • Christina St-Onge

Journal volume & issue
Vol. 5, no. 3
pp. 237–247

Abstract

Context: Clinical reasoning is an essential skill for medical students to learn, and it must therefore be assessed. Although written exams are widely used to assess clinical reasoning, there are no specific guidelines to help exam writers develop good clinical reasoning assessment questions. We therefore conducted a modified Delphi study to identify such guidelines.

Methods: Participants were identified from 1) the literature on clinical reasoning (i.e., authors who have written about clinical reasoning and its assessment), 2) the people responsible for assessment in Canadian medical faculties, and 3) a snowball sampling strategy. Thirty-two question-writing guidelines were drawn from the literature and adapted by the team members. Participants rated the perceived importance of each guideline on a ten-point Likert scale and, from the second round onward, the relevance of each guideline in five assessment contexts. Three rounds were conducted in total.

Results: Response rates were 24%, 57%, and 62% for the three rounds, respectively. Consensus on the importance of the guidelines (interquartile range < 2.5) was reached for all but four of them. Four guidelines were identified as important (median ≥ 9 on the ten-point scale): the question is based on a clinical case, the question presents a challenge achievable for the student, the correction scale (i.e., scoring grid) is explicit, and a panel of experts revises the questions.

Conclusion: A large number of guidelines appear relevant to writing clinical reasoning assessment questions for written exams. We are considering grouping these guidelines into categories to create a simple tool that medical educators can use when designing such questions. The next step will be to collect validity evidence for this tool: does it really help to build questions that assess clinical reasoning?

Keywords: Clinical reasoning, Assessment, Written-exam questions, Guidelines, UGME
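As an illustration of the consensus criteria reported in the Results (interquartile range < 2.5 for consensus; median ≥ 9 for importance), here is a minimal Python sketch of how such ratings could be classified. The function name, cutoff parameters, and sample ratings are hypothetical and not taken from the study.

```python
import statistics

def classify_guideline(ratings, iqr_cut=2.5, importance_cut=9):
    """Classify one guideline from panelists' ten-point Likert ratings.

    Consensus: interquartile range (Q3 - Q1) below the cutoff.
    Important: median rating at or above the importance cutoff.
    """
    q1, _, q3 = statistics.quantiles(ratings, n=4)  # quartile cut points
    median = statistics.median(ratings)
    return {
        "consensus": (q3 - q1) < iqr_cut,
        "important": median >= importance_cut,
    }

# Hypothetical ratings for one guideline from twelve panelists.
print(classify_guideline([9, 10, 9, 8, 9, 10, 9, 9, 8, 10, 9, 9]))
# -> {'consensus': True, 'important': True}
```

A guideline such as "the question is based on a clinical case" would, under this scheme, need both a tight spread of ratings (consensus) and a high median (importance) to be flagged as it was in the study.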