Abstract
The tension between the growing need for fact-checking and the limited capacity of professional fact-check providers has inspired several crowdsourced approaches. However, little is known about how effectively crowdsourced fact-checking performs in the real world at a large scale. We fill this gap by evaluating a Taiwanese crowdsourced fact-checking community and two professional fact-checking sites along four dimensions: Variety, Velocity, Veracity, and Viability. Our analysis shows that the two types of sites differ in topic coverage (variety) and demonstrates that while crowdsourced fact-checkers answer new requests much faster than professionals (velocity), they often build on existing professional knowledge for repeated requests. In addition, our findings indicate that the accuracy of the crowdsourced community (veracity) parallels that of the professional sources, and that crowdsourced fact-checks are perceived as nearly as objective, clear, and persuasive as professional ones (viability).