1. Measuring the Effectiveness of Gamesourcing Expert Oil Painting Annotations
- Authors
- Myriam Traub, Jacco van Ossenbruggen, Jiyin He, and Lynda Hardman
- Abstract
Tasks that require users to have expert knowledge are difficult to crowdsource. They are mostly too complex to be carried out by non-experts, and the available experts in the crowd are difficult to target. Adapting an expert task into a non-expert user task, thereby enabling the ordinary "crowd" to accomplish it, can be a useful approach. We studied whether a simplified version of an expert annotation task can be carried out by non-expert users. Users conducted a game-style annotation task of oil paintings. The obtained annotations were compared with those from experts. Our results show a significant agreement between the annotations done by experts and non-experts, that users improve over time, and that the aggregation of users' annotations per painting increases their precision.
- Published
- 2014
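
The abstract's claim that aggregating users' annotations per painting increases precision can be illustrated with a minimal sketch. This is not the authors' actual pipeline; it assumes a simple majority vote over hypothetical tag sets from three players, scored against a hypothetical expert annotation of the same painting:

```python
from collections import Counter

def majority_vote(player_tags, min_fraction=0.5):
    """Keep each tag proposed by more than `min_fraction` of the players."""
    n_players = len(player_tags)
    counts = Counter(tag for tags in player_tags for tag in set(tags))
    return {tag for tag, c in counts.items() if c / n_players > min_fraction}

def precision(predicted, expert):
    """Fraction of predicted tags confirmed by the expert annotation."""
    return len(predicted & expert) / len(predicted) if predicted else 0.0

# Hypothetical annotations for one painting: three players, one expert.
players = [
    {"portrait", "woman", "pearl"},
    {"portrait", "woman", "hat"},
    {"woman", "pearl", "dog"},
]
expert = {"portrait", "woman", "pearl", "earring"}

raw = set().union(*players)           # pooled tags from all players
agreed = majority_vote(players)       # tags a majority of players agree on

print(f"raw precision:        {precision(raw, expert):.2f}")    # 0.60
print(f"aggregated precision: {precision(agreed, expert):.2f}") # 1.00
```

Here the idiosyncratic tags ("hat", "dog") are filtered out by the vote, so the aggregated tag set has higher precision than the raw pool, mirroring the effect the paper reports.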