A framework for evaluation of crowdsourcing platforms performance.
- Source :
- Information Development; Nov 2024, Vol. 40 Issue 4, p635-647, 13p
- Publication Year :
- 2024
Abstract
- This study aims to identify an appropriate conceptual framework for evaluating crowdsourcing platforms from an open innovation perspective, employing a combination of qualitative and quantitative methods. The initial indices of the performance evaluation framework for crowdsourcing platforms are obtained through the Delphi method and interviews with experts. Using these factors, a statistical questionnaire is then designed and distributed among users of crowdsourcing platforms to confirm or reject the factors. Finally, the aspects of the performance evaluation framework of crowdsourcing platforms are specified from the open innovation perspective. Using fuzzy hierarchical analysis, these aspects are prioritized in order of importance: Collaboration, Project design, Moderation, Terms and conditions, UI/UX (user interface and user experience), and Key statistics. Given that crowdsourcing rests on crowd participation and the crowd intelligence of its users, Collaboration and Project design prove to be the most significant factors in evaluating a crowdsourcing platform. [ABSTRACT FROM AUTHOR]
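- The abstract names fuzzy hierarchical analysis (fuzzy AHP) as the prioritization step but gives no computational detail. The sketch below illustrates one common variant (Buckley's fuzzy geometric-mean method over triangular fuzzy numbers) applied to the six aspects listed above; the pairwise judgements, helper names, and resulting numbers are illustrative assumptions, not the authors' data or results.

```python
# Minimal sketch of fuzzy AHP (Buckley's geometric-mean variant) for ranking
# the six aspects named in the abstract. The pairwise judgements below are
# ILLUSTRATIVE placeholders, not the authors' survey data.
from math import prod

ASPECTS = ["Collaboration", "Project design", "Moderation",
           "Terms and conditions", "UI/UX", "Key statistics"]

# A judgement is a triangular fuzzy number (l, m, u); the lower triangle of
# the comparison matrix uses the reciprocal (1/u, 1/m, 1/l).
def tfn_reciprocal(t):
    l, m, u = t
    return (1.0 / u, 1.0 / m, 1.0 / l)

# Hypothetical upper-triangle judgements: how much more important aspect i is
# than aspect j (made up purely for demonstration).
upper = {
    (0, 1): (1, 2, 3), (0, 2): (2, 3, 4), (0, 3): (2, 3, 4), (0, 4): (3, 4, 5), (0, 5): (4, 5, 6),
    (1, 2): (1, 2, 3), (1, 3): (2, 3, 4), (1, 4): (2, 3, 4), (1, 5): (3, 4, 5),
    (2, 3): (1, 2, 3), (2, 4): (1, 2, 3), (2, 5): (2, 3, 4),
    (3, 4): (1, 2, 3), (3, 5): (1, 2, 3),
    (4, 5): (1, 2, 3),
}

n = len(ASPECTS)
matrix = [[(1.0, 1.0, 1.0)] * n for _ in range(n)]
for (i, j), t in upper.items():
    matrix[i][j] = t
    matrix[j][i] = tfn_reciprocal(t)

# Step 1: component-wise fuzzy geometric mean of each row.
geo_means = [tuple(prod(t[k] for t in row) ** (1.0 / n) for k in range(3))
             for row in matrix]

# Step 2: fuzzy weights w_i = r_i * (sum_j r_j)^-1
# (the inverse of a fuzzy sum swaps its lower and upper bounds).
sums = [sum(g[k] for g in geo_means) for k in range(3)]
fuzzy_weights = [(g[0] / sums[2], g[1] / sums[1], g[2] / sums[0]) for g in geo_means]

# Step 3: defuzzify by centroid and normalise to crisp priority weights.
crisp = [(l + m + u) / 3.0 for l, m, u in fuzzy_weights]
total = sum(crisp)
priorities = [c / total for c in crisp]

for name, w in sorted(zip(ASPECTS, priorities), key=lambda p: -p[1]):
    print(f"{name:22s} {w:.3f}")
```
- With the illustrative judgements above, the script prints the aspects in descending priority; the study's actual ranking (Collaboration first, Key statistics last) comes from the authors' expert and user inputs, not from this example.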
- Subjects :
- USER interfaces
- CROWDSOURCING
- OPEN innovation
- USER experience
- QUESTIONNAIRES
Details
- Language :
- English
- ISSN :
- 0266-6669
- Volume :
- 40
- Issue :
- 4
- Database :
- Complementary Index
- Journal :
- Information Development
- Publication Type :
- Academic Journal
- Accession number :
- 179940858
- Full Text :
- https://doi.org/10.1177/02666669231152553