BMJ Open (Feb 2024)

Tools for assessing quality of studies investigating health interventions using real-world data: a literature review and content analysis

  • Aukje K Mantel-Teeuwisse,
  • Rick A Vreman,
  • Olaf H Klungel,
  • Junfeng Wang,
  • Li Jiu,
  • Michiel Hartog,
  • Wim G Goettsch

DOI
https://doi.org/10.1136/bmjopen-2023-075173
Journal volume & issue
Vol. 14, no. 2

Abstract


Objectives
We aimed to identify existing appraisal tools for non-randomised studies of interventions (NRSIs) and to compare the criteria that the tools provide at the quality-item level.

Design
Literature review through three approaches: a systematic search of journal articles, a snowballing search of reviews on appraisal tools and a grey literature search of the websites of health technology assessment (HTA) agencies.

Data sources
Systematic search: Medline. Snowballing: starting from three articles (D’Andrea et al, Quigley et al and Faria et al). Grey literature: websites of European HTA agencies listed by the International Network of Agencies for Health Technology Assessment. Appraisal tools were searched through April 2022.

Eligibility criteria for selecting studies
We included a tool if it addressed quality concerns of NRSIs and was published in English (unless from grey literature). A tool was excluded if it was intended only for diagnostic, prognostic, qualitative or secondary studies.

Data extraction and synthesis
Two independent researchers searched, screened and reviewed all included studies and tools, summarised quality items and scored whether, and to what extent, each quality item was described by a tool, for either methodological quality or reporting.

Results
Forty-nine tools met the inclusion criteria and were included in the content analysis. Concerns regarding the quality of NRSIs were categorised into 4 domains and 26 items. The Research Triangle Institute Item Bank (RTI Item Bank) and STrengthening the Reporting of OBservational studies in Epidemiology (STROBE) were the most comprehensive tools for methodological quality and reporting, respectively, as they addressed the most items (n=20 and 17) and sufficiently described the most items (n=18 and 13). However, none of the tools covered all items.

Conclusion
Most of the tools have their own strengths, but none of them addressed all quality concerns relevant to NRSIs. Even the most comprehensive tools could be supplemented with several additional items. We suggest that decision-makers, researchers and tool developers consider this quality-item-level heterogeneity when selecting a tool or identifying a research gap.

OSF registration number
OSF registration DOI: https://doi.org/10.17605/OSF.IO/KCSGX.