IEEE Access (Jan 2023)

Reproducible Searches in Systematic Reviews: An Evaluation and Guidelines

  • Zheng Li,
  • Austen Rainer

DOI: https://doi.org/10.1109/ACCESS.2023.3299211
Journal volume & issue: Vol. 11, pp. 84048–84060

Abstract


[Context:] The Systematic Review is promoted as a more reliable way of producing a high-quality review of prior research, but a range of threats can undermine the reliability and quality of such reviews. One threat is the reproducibility of automated searches. [Objectives:] To evaluate the state of practice of reproducible searches in secondary studies, and to consider ways to improve the reproducibility of searches. [Method:] We re-run the searches of 621 secondary studies and analyse the outcomes of those (attempted) re-runs. We use the outcomes, and our experience of re-running the searches, to propose ways to improve the reproducibility of automated searches. [Results:] Across the 621 studies, more than 50% of the literal search strings (ignoring other settings) are not reusable; about 87% of the searches (i.e., including their settings) cannot be repeated; and around 94% of the searches (including all elements of the search) are irreproducible. We propose guidelines for automated search, directing particular attention to the formulation of search strings. [Conclusion:] While some aspects of automated search are beyond the direct control of researchers (e.g., variations in the features, constraints and performance of search engines), many aspects can be effectively managed through more careful formulation and execution of the search strings and the search settings. Although the results of our evaluation are disappointing, there are many simple, concrete steps that researchers can take to improve the reproducibility of their searches.
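To illustrate the kind of record that makes an automated search repeatable, the sketch below shows one possible way to capture a literal search string together with its settings (database, fields searched, filters, execution date, and reported hit count) so that a later re-run can be compared against the original. This is not the paper's actual guideline or format; the field names, the example query, and the JSON archive are assumptions chosen purely for illustration.

    """Illustrative sketch (not the authors' guidelines): recording an automated
    search so it can later be re-run and compared. All field names and the
    example values are hypothetical."""

    import json
    from dataclasses import dataclass, asdict, field
    from datetime import date


    @dataclass
    class SearchRecord:
        """Everything needed to attempt an exact re-run of one automated search."""
        database: str                 # e.g. "IEEE Xplore", "Scopus"
        search_string: str            # the literal query string, copied verbatim
        fields_searched: list = field(default_factory=list)  # e.g. title, abstract
        filters: dict = field(default_factory=dict)          # date range, doc type, ...
        search_date: str = ""         # ISO date on which the search was executed
        results_returned: int = 0     # hit count reported by the search engine


    # Hypothetical example entry for a review protocol.
    record = SearchRecord(
        database="IEEE Xplore",
        search_string='("systematic review" OR "systematic literature review") AND reproducib*',
        fields_searched=["title", "abstract", "keywords"],
        filters={"year_from": 2010, "year_to": 2022, "document_type": "journal"},
        search_date=date(2022, 6, 1).isoformat(),
        results_returned=342,
    )

    # Archiving the record as JSON alongside the review protocol lets later
    # researchers re-run the same string with the same settings and compare counts.
    print(json.dumps(asdict(record), indent=2))

Recording the search this way separates the three elements the evaluation distinguishes: the literal string (reusable on its own), the string plus its settings (repeatable), and the full search context including execution date and hit count (reproducible and checkable).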

Keywords