Environment International (Aug 2020)
A novel study evaluation strategy in the systematic review of animal toxicology studies for human health assessments of environmental chemicals
Abstract
A key aspect of the systematic review process is study evaluation to understand the strengths and weaknesses of individual studies included in the review. The present manuscript describes the process currently used by the Environmental Protection Agency's (EPA) Integrated Risk Information System (IRIS) Program to evaluate animal toxicity studies, illustrated by application to the recent systematic reviews of two phthalates: diisobutyl phthalate (DIBP) and diethyl phthalate (DEP). The IRIS Program uses a domain-based approach that was developed after careful consideration of tools used by others to evaluate experimental animal studies in toxicology and pre-clinical research. Standard practice is to have each study evaluated by at least two independent reviewers for aspects related to reporting quality; risk of bias/internal validity (e.g., randomization, blinding during outcome assessment, and the methods used to expose animals and assess outcomes); and sensitivity (i.e., factors that may limit the ability of a study to detect a true effect). To promote consistency across reviewers, prompting considerations and example responses are provided, and a pilot phase is conducted. The evaluation process is performed separately for each outcome reported in a study, as the utility of a study may vary across outcomes. Input from subject matter experts is used to identify chemical- and outcome-specific considerations (e.g., lifestage of exposure and outcome assessment when considering reproductive effects) to guide judgments within particular evaluation domains. For each evaluation domain, reviewers reach consensus on a rating of Good, Adequate, Deficient, or Critically Deficient. These individual domain ratings are then used to determine the overall confidence in the study (High Confidence, Medium Confidence, Low Confidence, or Deficient).
Study evaluation results, including the justifications for reviewer judgments, are documented and made publicly available in EPA's instance of the Health Assessment Workspace Collaborative (HAWC), a free and open-source, web-based software application. (The views expressed are those of the authors and do not necessarily represent the views or policies of the US EPA.)
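To make the rating scheme concrete, the aggregation of per-domain ratings into an overall study confidence level can be sketched in code. Note that the IRIS process reaches these judgments by expert reviewer consensus, not a fixed formula; the mapping rule below (e.g., treating any Critically Deficient domain as fatal) is a hypothetical illustration, and the function name and thresholds are assumptions, not part of the published method.

```python
# Hypothetical sketch: IRIS determines overall confidence by expert
# consensus; this fixed aggregation rule is illustrative only.
from collections import Counter

DOMAIN_RATINGS = ("Good", "Adequate", "Deficient", "Critically Deficient")

def overall_confidence(domain_ratings):
    """Map per-domain ratings for one outcome to an overall study
    confidence level (illustrative rule, not the IRIS procedure)."""
    counts = Counter(domain_ratings)
    if counts["Critically Deficient"] > 0:
        # Assumed rule: a fatal flaw in any domain sinks the study.
        return "Deficient"
    if counts["Deficient"] >= 2:
        return "Low Confidence"
    if counts["Deficient"] == 1:
        return "Medium Confidence"
    # All domains rated Good or Adequate.
    if counts["Good"] >= counts["Adequate"]:
        return "High Confidence"
    return "Medium Confidence"

ratings = ["Good", "Good", "Adequate", "Good", "Adequate"]
print(overall_confidence(ratings))  # High Confidence
```

A per-outcome structure follows naturally from the text: because study utility may vary by outcome, the function would be applied once per outcome, with each outcome carrying its own set of domain ratings.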