TY - JOUR
T1 - Crowdsource authoring as a tool for enhancing the quality of competency assessments in healthcare professions
AU - Lin, Che Wei
AU - Clinciu, Daniel L.
AU - Salcedo, Daniel
AU - Huang, Chih Wei
AU - Kang, Enoch Yi No
AU - Li, Yu Chuan Jack
N1 - Publisher Copyright:
Copyright: © 2023 Lin et al. This is an open access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.
PY - 2023/11
Y1 - 2023/11
AB - The current Objective Structured Clinical Examination (OSCE) is complex and costly, and it is difficult to deliver high-quality assessments with it. This pilot study employed a focus group and a debugging stage to test the Crowdsource Authoring Assessment Tool (CAAT), a platform for creating and sharing assessment tools that can be edited and customized to match specific users’ needs and to provide higher-quality checklists. International experts in competency assessment (n = 50) were asked to 1) participate in and experience the CAAT system while editing their own checklist, 2) edit a urinary catheterization checklist using the CAAT, and 3) complete a 14-item Technology Acceptance Model (TAM) questionnaire evaluating its four domains. The study took place between October 2018 and May 2019. The median time to develop a new checklist was 65.76 minutes with the CAAT, compared with 167.90 minutes with the traditional method. The CAAT enabled quicker checklist creation and editing regardless of participants’ experience and native language. Participants also reported that the CAAT enhanced checklist development, and 96% of them were willing to recommend the tool to others. As this study shows, the crowdsource authoring tool reduced development time to almost a third of that required by the traditional method. In addition, it allows collaborators to work together on a simple platform that encourages contributions to checklist creation, editing, and rating.
UR - http://www.scopus.com/inward/record.url?scp=85175786026&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85175786026&partnerID=8YFLogxK
U2 - 10.1371/journal.pone.0278571
DO - 10.1371/journal.pone.0278571
M3 - Article
C2 - 37917751
AN - SCOPUS:85175786026
SN - 1932-6203
VL - 18
JO - PLoS ONE
JF - PLoS ONE
IS - 11
M1 - e0278571
ER -