PSRMTE: Paper submission recommendation using mixtures of transformer encoders.
- Author
- Nguyen, Dac Huu; Huynh, Son Thanh; Dinh, Cuong Viet; Huynh, Phong Tan; Nguyen, Binh Thanh
- Subjects
- COMPUTATIONAL mathematics; RECOMMENDER systems; MACHINE learning; COMPUTER science; ELECTRONIC journals; APPLIED mathematics
- Abstract
Nowadays, the number of scientific submissions across research domains is increasing rapidly. Journals differ widely in acceptance rate, impact factor, and ranking across publishers, so selecting the most suitable journal for a submission has become time-consuming for many researchers. A paper submission recommendation system is therefore valuable to the research community and to publishers, as it gives scientists additional support to complete their submissions conveniently. This paper investigates the submission recommendation problem for two main research areas: computer science and applied mathematics. Unlike previous works (Wang et al., 2018; Son et al., 2020), which extract TF–IDF and statistical features and use classical machine learning algorithms (logistic regression and multilayer perceptrons) to build the recommendation engine, we present an efficient paper submission recommendation algorithm that uses different bidirectional transformer encoders together with a Mixture of Transformer Encoders technique. We compare our methodology with other approaches on the dataset from Wang et al. (2018), containing 14,012 computer science papers, and on a dataset we collected of 223,782 articles from 178 Springer applied mathematics journals, evaluated by top-K accuracy (K = 1, 3, 5, 10). The experimental results show that the proposed method outperforms other state-of-the-art techniques by a significant margin in all top-K accuracies on both datasets. We publish all collected datasets and our implementation code for further reference at https://github.com/BinhMisfit/PSRMTE.
• Bidirectional transformer encoders can improve the performance of a paper submission recommendation system.
• The Mixture of Transformer Encoders framework is effective for the paper submission recommendation problem.
• The proposed techniques surpass other recent techniques on the two related datasets.
[ABSTRACT FROM AUTHOR]
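To make the described approach concrete, below is a minimal, hypothetical Python/PyTorch sketch of a mixture-of-transformer-encoders classifier for journal recommendation, together with a top-K accuracy metric. The class name, gating design, encoder choices, and hyperparameters are illustrative assumptions and are not taken from the paper or the linked repository; the sketch also assumes all mixed encoders share a tokenizer and hidden size, which is a simplification.

```python
# Hypothetical sketch of a Mixture of Transformer Encoders for journal recommendation.
# All names and design choices here are assumptions, not the authors' implementation.
import torch
import torch.nn as nn
from transformers import AutoModel


class MixtureOfTransformerEncoders(nn.Module):
    def __init__(self, encoder_names, num_journals):
        super().__init__()
        # One pretrained bidirectional encoder per mixture component
        # (assumes the encoders share a tokenizer and hidden size, a simplification).
        self.encoders = nn.ModuleList(AutoModel.from_pretrained(n) for n in encoder_names)
        hidden = self.encoders[0].config.hidden_size
        # Gating network that assigns a softmax weight to each encoder for each paper.
        self.gate = nn.Linear(hidden * len(encoder_names), len(encoder_names))
        # Final classifier over the candidate journals.
        self.classifier = nn.Linear(hidden, num_journals)

    def forward(self, input_ids, attention_mask):
        # [CLS] vector from each encoder: a list of (batch, hidden) tensors.
        cls_vecs = [
            enc(input_ids=input_ids, attention_mask=attention_mask).last_hidden_state[:, 0]
            for enc in self.encoders
        ]
        stacked = torch.stack(cls_vecs, dim=1)                           # (batch, n_enc, hidden)
        weights = torch.softmax(self.gate(stacked.flatten(1)), dim=-1)   # (batch, n_enc)
        mixed = (weights.unsqueeze(-1) * stacked).sum(dim=1)             # (batch, hidden)
        return self.classifier(mixed)                                    # logits over journals


def top_k_accuracy(logits, labels, k):
    # Fraction of papers whose true journal appears among the k highest-scoring predictions.
    top_k = logits.topk(k, dim=-1).indices
    return (top_k == labels.unsqueeze(-1)).any(dim=-1).float().mean().item()
```

Under these assumptions, such a model would be scored with top_k_accuracy for K = 1, 3, 5, 10, matching the evaluation protocol described in the abstract.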
- Published
- 2022