
SChuBERT: Scholarly Document Chunks with BERT-encoding boost Citation Count Prediction

Authors:
van Dongen, Thomas
Maillette de Buy Wenniger, Gideon
Schomaker, Lambert
Chandrasekaran, Muthu Kumar
Source:
Proceedings of the First Workshop on Scholarly Document Processing, pp. 148-157
Publication Year:
2020
Publisher:
Association for Computational Linguistics (ACL)

Abstract

Predicting the number of citations of scholarly documents is an emerging task in scholarly document processing. Besides its intrinsic merit, this information has wider use as an imperfect proxy for quality, with the advantage of being cheaply available for large volumes of scholarly documents. Previous work on citation count prediction has used either relatively small training datasets or larger datasets with short, incomplete input text. In this work we leverage the open-access ACL Anthology collection in combination with the Semantic Scholar bibliometric database to create a large corpus of scholarly documents with associated citation information, and we propose a new citation prediction model called SChuBERT. In our experiments we compare SChuBERT with several state-of-the-art citation prediction models and show that it outperforms previous methods by a large margin. We also show the merit of using more training data and longer input text for citation count prediction.

Comment: Published at the First Workshop on Scholarly Document Processing, at EMNLP 2020. Minor corrections were made to the workshop version, including the addition of color to Figures 1 and 2.
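As the model name suggests, SChuBERT encodes scholarly documents chunk by chunk so that long full texts can be processed despite BERT's 512-token input limit. The following is a minimal sketch of such a chunking step; the chunk size and the optional overlap parameter are illustrative assumptions, not the paper's exact settings, and `chunk_tokens` is a hypothetical helper name.

```python
def chunk_tokens(tokens, chunk_size=512, overlap=0):
    """Split a long token sequence into consecutive chunks of at most
    chunk_size tokens, so each chunk can be encoded by BERT separately.

    overlap > 0 makes adjacent chunks share that many tokens, which can
    reduce context loss at chunk boundaries (an optional refinement).
    """
    if chunk_size <= 0 or not 0 <= overlap < chunk_size:
        raise ValueError("require chunk_size > 0 and 0 <= overlap < chunk_size")
    step = chunk_size - overlap
    return [tokens[i:i + chunk_size] for i in range(0, len(tokens), step)]


# Example: a 10-token "document" split into chunks of 4 tokens.
doc = list(range(10))
print(chunk_tokens(doc, chunk_size=4))
# → [[0, 1, 2, 3], [4, 5, 6, 7], [8, 9]]
```

Each chunk would then be fed through a pretrained BERT encoder, and the per-chunk embeddings aggregated by a downstream module that regresses the citation count.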

Details

Language:
English
Database:
OpenAIRE
Journal:
Proceedings of the First Workshop on Scholarly Document Processing, pp. 148-157
Accession number:
edsair.doi.dedup.....7d08eaaeacf729c1f3653e245e297382