Implementation Science Communications (Jan 2023)

Sustainment of diverse evidence-informed practices disseminated in the Veterans Health Administration (VHA): initial development and piloting of a pragmatic survey tool

  • Caitlin M. Reardon,
  • Laura Damschroder,
  • Marilla A. Opra Widerquist,
  • Maria Arasim,
  • George L. Jackson,
  • Brandolyn White,
  • Sarah L. Cutrona,
  • Gemmae M. Fix,
  • Allen L. Gifford,
  • Kathryn DeLaughter,
  • Heather A. King,
  • Blake Henderson,
  • Ryan Vega,
  • Andrea L. Nevedal

DOI
https://doi.org/10.1186/s43058-022-00386-z
Journal volume & issue
Vol. 4, no. 1
pp. 1 – 14

Abstract

Background
There are challenges associated with measuring sustainment of evidence-informed practices (EIPs). First, the terms sustainability and sustainment are often conflated: sustainability assesses the likelihood of an EIP being in use in the future, while sustainment assesses the extent to which an EIP is (or is not) currently in use. Second, grant funding often ends before sustainment can be assessed. The Veterans Health Administration (VHA) Diffusion of Excellence (DoE) program is one of the few large-scale models of diffusion; it seeks to identify and disseminate practices across the VHA system. The DoE sponsors “Shark Tank” competitions, in which leaders bid on the opportunity to implement a practice with approximately 6 months of implementation support. As part of an ongoing evaluation of the DoE, we sought to develop and pilot a pragmatic survey tool to assess sustainment of DoE practices.

Methods
In June 2020, surveys were sent to 64 facilities that were part of the DoE evaluation. We began analysis by comparing the alignment of quantitative and qualitative responses; some facility representatives reported in the open-text box of the survey that their practice was on a temporary hold due to COVID-19 but answered the primary outcome question differently. As a result, the team reclassified the primary outcome of these facilities to Sustained: Temporary COVID-Hold. Following this reclassification, the number and percentage of facilities in each category were calculated. We used directed content analysis, guided by the Consolidated Framework for Implementation Research (CFIR), to analyze open-text responses.

Results
Representatives from 41 of the 64 facilities (64%) completed the survey. Among responding facilities, 29/41 had sustained their practice, 1/41 had partially sustained their practice, 8/41 had not sustained their practice, and 3/41 had never implemented their practice. Sustainment rates increased across Cohorts 1–4.

Conclusions
The initial development and piloting of our pragmatic survey allowed us to assess sustainment of DoE practices. Planned updates to the survey will enable flexibility in assessing sustainment and its determinants at any phase after adoption. This assessment approach can flex with the longitudinal and dynamic nature of sustainment, including capturing nuances in outcomes when practices are on a temporary hold. If additional piloting illustrates that the survey is useful, we plan to assess the reliability and validity of this measure for broader use in the field.