Journal of Medical Education and Curricular Development (Jul 2024)

Impact of Performing Medical Writing/Publishing Workshops: A Systematic Survey and Meta-Analysis

  • Behrooz Astaneh,
  • Ream Abdullah,
  • Vala Astaneh,
  • Sana Gupta,
  • Hadi Raeisi Shahraki,
  • Aminreza Asadollahifar,
  • Gordon Guyatt

DOI: https://doi.org/10.1177/23821205241269378
Journal volume & issue: Vol. 11

Abstract

Objectives: Proficiency in medical writing and publishing is essential for medical researchers, and workshops can play a valuable role in addressing this need. However, systematic summaries of evidence evaluating the impact of such workshops are lacking. In this systematic review, we therefore aimed to evaluate all published articles on the impact of such workshops worldwide.

Methods: We searched Ovid EMBASE, Ovid Medline, ISI Web of Science, the ERIC database, and grey literature with no limitations on language, time period, or geographical location. Randomized controlled trials, cohort studies, before-after studies, surveys, and program evaluation and development studies were included. We performed a meta-analysis on data related to knowledge increase after the workshops and descriptively reported the findings of articles that lacked sufficient data for meta-analysis. All analyses were performed using Stata software, version 15.0.

Results: Of 23 040 reports, 222 articles underwent full-text review, of which 45 reported the impact of workshops. Overall, the reports on the impact of such workshops were incomplete or lacked the precision needed to draw firm conclusions. The workshops were sporadic, and researchers used their own methods of assessment. Meta-analyses of the impact on knowledge showed that workshops increased the mean or percentage of participants' knowledge, but the increase was not statistically significant.

Conclusion: In the absence of systematic academic courses on medical writing/publishing, workshops are conducted worldwide; however, reports on the educational activities during such workshops, the methods of presentation, and their curricula are incomplete and vary. Their impact is not evaluated using standardized methods, and no valid and reliable measurement tools have been employed for these assessments.
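
To illustrate the kind of pooling the Methods describe, the sketch below shows a minimal random-effects (DerSimonian-Laird) meta-analysis of per-study mean knowledge gains in Python. It is not the authors' Stata analysis, and the study effect sizes and variances are hypothetical placeholders, not data from the review.

    import numpy as np

    # Hypothetical per-study inputs (NOT from the review):
    # y = mean increase in knowledge score after the workshop
    # v = variance of each study's estimate
    y = np.array([1.2, 0.8, 1.5, 0.4])
    v = np.array([0.30, 0.25, 0.50, 0.20])

    # Fixed-effect weights and pooled estimate
    w = 1.0 / v
    y_fe = np.sum(w * y) / np.sum(w)

    # DerSimonian-Laird estimate of between-study variance (tau^2)
    k = len(y)
    Q = np.sum(w * (y - y_fe) ** 2)
    C = np.sum(w) - np.sum(w ** 2) / np.sum(w)
    tau2 = max(0.0, (Q - (k - 1)) / C)

    # Random-effects weights, pooled mean difference, and 95% CI
    w_re = 1.0 / (v + tau2)
    pooled = np.sum(w_re * y) / np.sum(w_re)
    se = np.sqrt(1.0 / np.sum(w_re))
    lo, hi = pooled - 1.96 * se, pooled + 1.96 * se

    print(f"Pooled mean increase: {pooled:.2f} (95% CI {lo:.2f} to {hi:.2f})")

A confidence interval that crosses zero in such a model corresponds to the abstract's finding of an increase in knowledge that is not statistically significant.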