PLoS ONE (Apr 2015)

An assessment of the methodological quality of published network meta-analyses: a systematic review.

  • James D Chambers,
  • Huseyin Naci,
  • Olivier J Wouters,
  • Junhee Pyo,
  • Shalak Gunjal,
  • Ian R Kennedy,
  • Mark G Hoey,
  • Aaron Winn,
  • Peter J Neumann

DOI
https://doi.org/10.1371/journal.pone.0121715
Journal volume & issue
Vol. 10, no. 4
p. e0121715

Abstract

To assess the methodological quality of published network meta-analyses.

Systematic review.

We searched the medical literature for network meta-analyses of pharmaceuticals. We assessed general study characteristics, study transparency and reproducibility, methodological approach, and reporting of findings. We compared studies published in journals with lower impact factors with those published in journals with higher impact factors, studies published prior to January 1st, 2013 with those published after that date, and studies supported financially by industry with those supported by non-profit institutions or that received no support.

The systematic literature search identified 854 citations. Three hundred and eighteen studies met our inclusion criteria. The number of network meta-analyses has grown rapidly, with 48% of studies published since January 2013. The majority of network meta-analyses were supported by a non-profit institution or received no support (68%). We found considerable inconsistencies among reviewed studies. Eighty percent reported search terms, 61% a network diagram, 65% sufficient data to replicate the analysis, and 90% the characteristics of included trials. Seventy percent performed a risk of bias assessment of included trials, 40% an assessment of model fit, and 56% a sensitivity analysis. Among studies with a closed loop, 69% examined the consistency of direct and indirect evidence. Sixty-four percent of studies presented the full matrix of head-to-head treatment comparisons. For Bayesian studies, 41% reported the probability that each treatment was best, 31% reported treatment ranking, and 16% included the model code or referenced publicly available code. Network meta-analyses published in higher impact factor journals and those that did not receive industry support performed better across the assessment criteria. We found few differences between older and newer studies.

There is substantial variation in the network meta-analysis literature. Consensus among guidelines is needed to improve the methodological quality, transparency, and consistency of study conduct and reporting.
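
As an illustration of the consistency check referred to in the abstract (a sketch not drawn from the paper itself; it follows the commonly used Bucher-style indirect comparison, with illustrative notation), in a closed A-B-C loop an indirect estimate of the B-versus-C effect can be built from the A-B and A-C direct estimates and then compared with the direct B-C estimate:

% Consistency check in a closed A-B-C loop (illustrative notation)
\[
\hat{d}^{\mathrm{ind}}_{BC} = \hat{d}^{\mathrm{dir}}_{AC} - \hat{d}^{\mathrm{dir}}_{AB},
\qquad
\mathrm{Var}\bigl(\hat{d}^{\mathrm{ind}}_{BC}\bigr)
= \mathrm{Var}\bigl(\hat{d}^{\mathrm{dir}}_{AC}\bigr) + \mathrm{Var}\bigl(\hat{d}^{\mathrm{dir}}_{AB}\bigr)
\]
\[
z = \frac{\hat{d}^{\mathrm{dir}}_{BC} - \hat{d}^{\mathrm{ind}}_{BC}}
{\sqrt{\mathrm{Var}\bigl(\hat{d}^{\mathrm{dir}}_{BC}\bigr) + \mathrm{Var}\bigl(\hat{d}^{\mathrm{ind}}_{BC}\bigr)}}
\]

A large |z| relative to a standard normal reference signals disagreement between the direct and indirect evidence in that loop, which is the kind of inconsistency assessment the review checked for in studies with closed loops.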