PLoS ONE (Aug 2024)

Methodological rigor and quality of reporting of clinical trials published with physical activity interventions: A report from the Strengthening the Evidence in Exercise Sciences Initiative (SEES Initiative).

  • Andresa Conrado Ignacio,
  • Nórton Luís Oliveira,
  • Larissa Xavier Neves da Silva,
  • Jayne Feter,
  • Angélica Trevisan De Nardi,
  • Lucas Helal,
  • Marcelo Rodrigues Dos Santos,
  • Douglas Dos Santos Soares,
  • Leony Morgana Galliano,
  • Tainá Silveira Alano,
  • Daniel Umpierre

DOI
https://doi.org/10.1371/journal.pone.0309087
Journal volume & issue
Vol. 19, no. 8
p. e0309087

Abstract

Background: This study addresses the need for improved transparency and reproducibility in randomized clinical trials (RCTs) of physical activity (PA) interventions. Despite efforts to promote these practices, there is limited evidence on adherence to established reporting and methodological standards in published RCTs. As part of the Strengthening the Evidence in Exercise Sciences Initiative (SEES Initiative), this study assessed the methodological standards and reporting quality of RCTs of PA interventions published in 2020.

Methods: RCTs of PA advice or exercise interventions published in 2020 were selected. Monthly searches were conducted on PubMed/MEDLINE targeting six top-tier exercise science journals. Two independent authors assessed each trial against 44 items drawn from the CONSORT and TIDieR reporting guidelines, grouped into seven domains: transparency, completeness, participants, intervention, methodological rigor, outcomes, and critical analysis. Descriptive analysis used absolute and relative frequencies, and exploratory analysis compared proportions with the χ2 test (α = 0.05).

Results: Of 1,766 RCTs evaluated for eligibility, 53 were included. The median adherence across studies was 30 (18-44) of the recommended items in individual assessments. Notably, items demonstrating full adherence related to intervention description, justification, outcome measurement, effect sizes, and statistical analysis. Conversely, the least reported item, mention of unplanned modifications during the trial, appeared in only 11.3% of studies. Of the 53 RCTs, 67.9% reported having a registration, and registered studies showed higher adherence to the assessed items than non-registered ones.

Conclusions: In summary, while critical analysis aspects were more comprehensively described, aspects associated with transparency, such as protocol registrations/modifications and intervention descriptions, were reported suboptimally. The findings underscore the importance of promoting resources related to reporting quality and transparent research practices for investigators and editors in the exercise sciences.
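As a concrete illustration of the exploratory analysis described in the Methods (comparing proportions with the χ2 test at α = 0.05), the minimal Python sketch below applies scipy.stats.chi2_contingency to a hypothetical 2x2 table of adherent versus non-adherent item counts for registered and non-registered trials. The counts are invented for illustration only and are not data from the study.

    # Minimal sketch of the proportion comparison described in the Methods:
    # a chi-square test at alpha = 0.05, applied here to a hypothetical
    # 2x2 table (the counts below are invented, not data from the study).
    from scipy.stats import chi2_contingency

    ALPHA = 0.05

    # Rows: registered vs. non-registered trials.
    # Columns: items reported vs. items not reported (aggregated counts).
    observed = [
        [1200, 384],  # registered trials
        [400, 348],   # non-registered trials
    ]

    chi2, p_value, dof, expected = chi2_contingency(observed)

    print(f"chi2 = {chi2:.2f}, dof = {dof}, p = {p_value:.4f}")
    if p_value < ALPHA:
        print("Proportions differ at the 0.05 significance level.")
    else:
        print("No significant difference at the 0.05 significance level.")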