PLoS ONE (Jan 2023)

Crowdsource authoring as a tool for enhancing the quality of competency assessments in healthcare professions.

  • Che-Wei Lin,
  • Daniel L Clinciu,
  • Daniel Salcedo,
  • Chih-Wei Huang,
  • Enoch Yi No Kang,
  • Yu-Chuan Jack Li

DOI: https://doi.org/10.1371/journal.pone.0278571
Journal volume & issue: Vol. 18, no. 11, p. e0278571

Abstract

The current Objective Structured Clinical Examination (OSCE) is complex and costly, and it is difficult to provide high-quality assessments with it. This pilot study employed a focus group and a debugging stage to test the Crowdsource Authoring Assessment Tool (CAAT), a platform for creating and sharing assessment checklists that can be edited and customized to match specific users' needs and to yield higher-quality checklists. International competency-assessment experts (n = 50) were asked to 1) participate in and experience the CAAT system while editing their own checklists, 2) edit a urinary catheterization checklist using the CAAT, and 3) complete a 14-item Technology Acceptance Model (TAM) questionnaire evaluating its four domains. The study was conducted between October 2018 and May 2019. The median time for developing a new checklist with the CAAT was 65.76 minutes, whereas the traditional method required 167.90 minutes. The CAAT system enabled quicker checklist creation and editing regardless of participants' experience and native language. Participants also reported that the CAAT enhanced checklist development, with 96% willing to recommend the tool to others. As this study shows, the crowdsource authoring tool reduced checklist-development time to roughly one third of that required by the traditional method. In addition, it allows collaborators to work together on a simple platform that encourages contributions to checklist creation, editing, and rating.