BMJ Open (Nov 2021)

Joint international consensus statement on crowdsourcing challenge contests in health and medicine: results of a modified Delphi process

  • Weiming Tang,
  • Tiarney Ritchwood,
  • Joseph Tucker,
  • Larry Han,
  • Don Mathanga,
  • Phyllis Awor,
  • Suzanne Day,
  • Noel Juban,
  • Shufang Wei,
  • Huanyu Bao,
  • Randall John,
  • Eneyi Kpokiri,
  • Diana Castro-Arroyave,
  • Vibhu Ambil,
  • Yuan Xiong,
  • Emmanuela Oppong

DOI
https://doi.org/10.1136/bmjopen-2021-048699
Journal volume & issue
Vol. 11, no. 11

Abstract

Objectives To develop a consensus statement to provide advice on designing, implementing and evaluating crowdsourcing challenge contests in public health and medical contexts.

Design Modified Delphi using three rounds of survey questionnaires and one consensus workshop.

Setting Uganda for face-to-face consensus activities; survey questionnaires administered online globally.

Participants A multidisciplinary expert panel was convened at a consensus-development conference in Uganda and included 21 researchers with experience leading challenge contests, five public health sector workers, and nine Ugandan end users. An online survey was sent to 140 corresponding authors of previously published articles that had used crowdsourcing methods.

Results A subgroup of expert panel members developed the initial statement and survey. We received responses from 120 (85.7%) survey participants, which were presented at an in-person workshop of all 21 panel members. Panellists discussed each section, revised the statement, and participated in a second round of the survey questionnaire. Based on this second survey round, we held detailed discussions of each subsection with workshop participants and further revised the consensus statement. We then conducted the third round of the questionnaire among the 21 expert panellists and used the results to finalise the statement. This iterative process resulted in 23 final statement items, all with greater than 80% consensus. Statement items are organised into the seven stages of a challenge contest: considering the appropriateness, organising a community steering committee, promoting the contest, assessing contributions, recognising contributors, sharing ideas and evaluating the contest (COPARSE).

Conclusions There is high agreement among crowdsourcing experts and stakeholders on the design and implementation of crowdsourcing challenge contests. The COPARSE consensus statement can be used to organise crowdsourcing challenge contests, improve the rigour and reproducibility of crowdsourcing research and enable large-scale collaboration.