PLoS ONE (Jan 2016)

Using the Web to Collect Data on Sensitive Behaviours: A Study Looking at Mode Effects on the British National Survey of Sexual Attitudes and Lifestyles.

  • Sarah Burkill,
  • Andrew Copas,
  • Mick P Couper,
  • Soazig Clifton,
  • Philip Prah,
  • Jessica Datta,
  • Frederick Conrad,
  • Kaye Wellings,
  • Anne M Johnson,
  • Bob Erens

DOI: https://doi.org/10.1371/journal.pone.0147983
Journal volume & issue: Vol. 11, no. 2, p. e0147983

Abstract


BACKGROUND: Interviewer-administered surveys are an important method of collecting population-level epidemiological data, but they suffer from declining response rates and increasing costs. Web surveys offer more rapid data collection and lower costs. There are concerns, however, about data quality from web surveys. Previous research has largely focused on selection biases, and few studies have explored measurement differences. This paper aims to assess the extent to which mode affects the responses given by the same respondents at two points in time, providing information on potential measurement error if web surveys are used in the future.

METHODS: 527 participants from the third British National Survey of Sexual Attitudes and Lifestyles (Natsal-3), which uses computer-assisted personal interview (CAPI) and computer-assisted self-interview (CASI) modes, subsequently responded to identically worded questions in a web survey. McNemar tests assessed whether within-person differences in responses were at random or indicated a mode effect, i.e. higher reporting of more sensitive responses in one mode. An analysis of pooled responses using generalized estimating equations addressed the impact of gender and question type on change.

RESULTS: Only 10% of responses changed between surveys. However, mode effects were found for about a third of the variables, with higher reporting of sensitive responses more common on the web than in Natsal-3.

CONCLUSIONS: The web appears to be a promising mode for surveys of sensitive behaviours, most likely as part of a mixed-mode design. Our findings suggest that mode effects may vary by question type and content, and by the particular mix of modes used. Mixed-mode surveys need careful development to understand mode effects and how to account for them.
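To illustrate the kind of paired analysis described in the Methods, the sketch below runs a McNemar test on a single yes/no question answered by the same respondents in both modes. The counts are hypothetical (not taken from the paper), and the use of statsmodels is an assumption about tooling rather than a description of the authors' analysis.

```python
# Minimal sketch of a McNemar test for a mode effect on one paired yes/no item.
# Counts are hypothetical; the study's own data and software are not reproduced here.
from statsmodels.stats.contingency_tables import mcnemar

# 2x2 table of paired answers from the same respondents:
# rows = response in Natsal-3 (CAPI/CASI), columns = response in the web survey.
#                web: yes  web: no
table = [[200,      15],   # Natsal-3: yes
         [ 40,     272]]   # Natsal-3: no

# McNemar's test uses only the discordant cells (15 vs 40) to ask whether
# changes between modes are symmetric (random) or favour one mode.
result = mcnemar(table, exact=True)
print(f"statistic={result.statistic}, p-value={result.pvalue:.4f}")
```

In this hypothetical table, far more respondents switch from "no" in Natsal-3 to "yes" on the web than the reverse, which is the asymmetric pattern the paper interprets as higher reporting of sensitive responses in one mode.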