An exploratory study of the impact of task selection strategies on worker performance in crowdsourcing microtasks
Banuqitah, Huda; Dunlop, Mark; Abulkhair, Maysoon; Terzis, Sotirios (2024) An exploratory study of the impact of task selection strategies on worker performance in crowdsourcing microtasks. In: Demartini, Gianluca and Gadiraju, Ujwal (eds.) Proceedings of the 12th AAAI Conference on Human Computation and Crowdsourcing. Proceedings of the AAAI Conference on Human Computation and Crowdsourcing. AAAI Press, USA, pp. 2-11. ISBN 9781577358930. https://doi.org/10.1609/hcomp.v12i1.31595
Text: Banuqitah-etal-AAAI-2024-the-impact-of-task-selection-strategies-on-worker-performance.pdf (Accepted Author Manuscript, Strathprints license 1.0, 1MB)
Abstract
In microtask crowdsourcing systems such as Amazon Mechanical Turk (AMT) and Appen Figure Eight, workers often employ task selection strategies, completing sequences of tasks to maximize their earnings. While previous literature has explored the effects of sequences of tasks of the same type with varying complexity, little is known about the consequences of performing multiple task types of similar difficulty. This study examines the impact of sequences of three frequently used task types, namely image classification, text classification, and surveys, on workers' engagement, accuracy, and perceived workload. In addition, we analyze the influence of workers' personality traits on their task selection strategies. Our study, which involved 558 participants on AMT, found that engaging in sequences of distinct task types had a detrimental effect on engagement and accuracy in the classification tasks, and also increased perceived task load and worker frustration. Nevertheless, the precise order of the tasks did not significantly affect these results. Moreover, we found a slight association between personality traits and workers' task selection strategies. The results offer valuable insights for designing efficient and inclusive crowdsourcing platforms.
ORCID iDs
Banuqitah, Huda; Dunlop, Mark (ORCID: https://orcid.org/0000-0002-4593-1103); Abulkhair, Maysoon; Terzis, Sotirios (ORCID: https://orcid.org/0000-0002-5061-9923). Editors: Demartini, Gianluca; Gadiraju, Ujwal.
Item type: Book Section
ID code: 91613
Dates: 14 October 2024 (Published)
Subjects: Science > Mathematics > Electronic computers. Computer science > Other topics, A-Z > Human-computer interaction; Social Sciences > Commerce > Business > Industrial psychology
Department: Faculty of Science > Computer and Information Sciences
Depositing user: Pure Administrator
Date deposited: 18 Dec 2024 10:51
Last modified: 18 Dec 2024 10:51
URI: https://strathprints.strath.ac.uk/id/eprint/91613