MedEdPORTAL (Mar 2013)

Critical Synthesis Package: Medical Course Experience Questionnaire

  • John T.E. Richardson

DOI: https://doi.org/10.15766/mep_2374-8265.9353
Journal volume & issue: Vol. 9

Abstract

This Critical Synthesis Package contains: (1) a Critical Analysis of the psychometric properties and application to health sciences education of the Medical Course Experience Questionnaire (MCEQ), and (2) a copy of the MCEQ instrument developed by Tim Wilkinson, MD. The MCEQ is an 18-item instrument developed for obtaining feedback from graduates of medical schools. It is a self-completion instrument administered by postal survey and is intended to complement more generic instruments in providing evidence for quality assurance purposes. It contains four scales: Clinical Practice, Becoming a Professional, Influences on Health Delivery, and Professional Support.

Psychometric data come from a study of medical school graduates in New Zealand that achieved a response rate of only 30%. Responses were scored from 1 (strongly agree) to 5 (strongly disagree). The items themselves possess good content validity and face validity, and the scale scores demonstrated satisfactory internal consistency. However, the scoring procedure is not clearly described. For example, high scale scores were taken to reflect positive attitudes, which implies that the responses were actually reverse-coded (i.e., that 1 was recoded as 5 and vice versa) before the scale scores were calculated. It would also be sensible to calculate scale scores by averaging the item scores, to allow for the varying numbers of items in the scales. Regarding the four scales, the eigenvalues-greater-than-one rule is known to overestimate the true number of factors because of sampling effects. Parallel analysis shows that the first four eigenvalues to be expected from a purely random correlation matrix would be 1.22, 1.18, 1.15, and 1.12; comparing these values with the obtained eigenvalues reported for this measure suggests that only three factors should have been extracted from the data set, rather than four.

Given the poor response rate in the development study, it is difficult to generalize the findings, even within the same institution. The response rate might have been improved if the respondents had thought that the findings would be made public. In short, the MCEQ's items possess good content validity and face validity, but they probably constitute only three scales, not four. Further development work is needed if the MCEQ is to constitute a useful instrument for obtaining student feedback.
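
To make the recommended corrections concrete, the following is a minimal sketch (in Python with NumPy, which is not part of the original package) of reverse-coding, scale-score averaging, and the parallel-analysis logic described above. The sample size and simulation settings are assumptions for illustration only; the benchmark eigenvalues of 1.22, 1.18, 1.15, and 1.12 are those reported in the Critical Analysis for the actual data set.

import numpy as np

def reverse_code(responses, low=1, high=5):
    # Recode so that higher values indicate more positive attitudes
    # (1 becomes 5, 2 becomes 4, and so on).
    return (low + high) - np.asarray(responses, dtype=float)

def scale_score(items):
    # Average the item scores within a scale so that scales with
    # different numbers of items are on the same metric.
    return np.nanmean(items, axis=1)

def parallel_analysis(n_respondents, n_items, n_sims=1000, seed=0):
    # Mean eigenvalues of correlation matrices computed from purely
    # random data with the same dimensions as the observed data set.
    rng = np.random.default_rng(seed)
    sims = np.empty((n_sims, n_items))
    for i in range(n_sims):
        data = rng.standard_normal((n_respondents, n_items))
        corr = np.corrcoef(data, rowvar=False)
        sims[i] = np.sort(np.linalg.eigvalsh(corr))[::-1]
    return sims.mean(axis=0)

# Hypothetical usage: retain only those factors whose observed eigenvalues
# exceed the corresponding random benchmarks; for the actual MCEQ data set
# the Critical Analysis reports benchmarks of 1.22, 1.18, 1.15, and 1.12.
random_eigs = parallel_analysis(n_respondents=200, n_items=18)  # 200 is illustrative, not the study's N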

Keywords