Researchers' judgment criteria of high-quality answers on academic social Q&A platforms
- Author
Lei Li, Chengzhi Zhang, Daqing He, and Jia Tina Du
- Subjects
academic social Q&A, ResearchGate Q&A, academic answer quality, judgment criteria, academic social networking, social media, Library and Information Sciences, Computer Science Applications, Information Systems
- Abstract
Purpose: Through a two-stage survey, this paper examines how researchers judge the quality of answers on ResearchGate Q&A, an academic social networking site.
Design/methodology/approach: In the first-stage survey, 15 researchers from Library and Information Science (LIS) judged the quality of 157 answers to 15 questions and reported the criteria that they had used. The content of their reports was analyzed, and the results were merged with relevant criteria from the literature to form the second-stage survey questionnaire. This questionnaire was then completed by researchers recognized as accomplished at identifying high-quality LIS answers on ResearchGate Q&A.
Findings: Most of the identified quality criteria for academic answers, such as relevance, completeness, and verifiability, have previously been found applicable to generic answers. The authors also found other criteria, such as comprehensiveness, the answerer's scholarship, and value-added. Providing opinions was found to be the most important criterion, followed by completeness and value-added.
Originality/value: The findings show the importance of studying the quality of answers on academic social Q&A platforms and reveal unique considerations for the design of such systems.
- Published
2020