
Parameter-efficient feature-based transfer for paraphrase identification.

Authors:
Liu, Xiaodong
Rzepka, Rafal
Araki, Kenji
Source:
Natural Language Engineering; Jul2023, Vol. 29 Issue 4, p1066-1096, 31p
Publication Year:
2023

Abstract

There are many approaches to Paraphrase Identification (PI), the NLP task of determining whether a sentence pair is semantically equivalent. Traditional approaches rely mainly on unsupervised learning and feature engineering, which are computationally inexpensive; however, their task performance is now only moderate. To find a method that preserves the low computational cost of traditional approaches while yielding better task performance, we investigate neural network-based transfer learning. We find that this goal can be met by using parameters more efficiently in feature-based transfer. To that end, we propose a pre-trained task-specific architecture whose fixed parameters can be shared by multiple classifiers, each adding only a small number of parameters of its own. As a result, the only remaining computational cost of parameter updates comes from classifier tuning: the features output by the architecture, combined with lexical overlap features, are fed into a single classifier for tuning. Furthermore, the pre-trained task-specific architecture can also be applied to natural language inference and semantic textual similarity tasks. This design consumes few computational and memory resources per task and is conducive to power-efficient continual learning. Experimental results show that the proposed method is competitive with adapter-BERT (a parameter-efficient fine-tuning approach) on some tasks while using only 16% of the trainable parameters and saving 69-96% of parameter-update time. [ABSTRACT FROM AUTHOR]
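The transfer scheme the abstract describes (a frozen, shared feature extractor whose outputs are concatenated with lexical overlap features and fed to a small tunable classifier) can be sketched roughly as follows. This is a minimal illustration only: the frozen character-trigram module below is a hypothetical stand-in for the paper's pre-trained task-specific architecture, and the toy data, feature choices, and function names are all assumptions, not the authors' implementation.

```python
import math

def lexical_overlap_features(s1, s2):
    """Cheap lexical-overlap features: Jaccard, two containment ratios,
    and a token-length ratio (all hypothetical choices for illustration)."""
    t1, t2 = set(s1.lower().split()), set(s2.lower().split())
    inter, union = len(t1 & t2), len(t1 | t2)
    return [
        inter / union if union else 0.0,
        inter / len(t1) if t1 else 0.0,
        inter / len(t2) if t2 else 0.0,
        min(len(t1), len(t2)) / max(len(t1), len(t2), 1),
    ]

def frozen_features(s1, s2):
    """Stand-in for the frozen pre-trained module: a fixed character-trigram
    cosine similarity with no trainable parameters, so it can be shared
    across many classifiers at no parameter-update cost."""
    tri = lambda s: {s.lower()[i:i + 3] for i in range(max(len(s) - 2, 1))}
    g1, g2 = tri(s1), tri(s2)
    return [len(g1 & g2) / math.sqrt(len(g1) * len(g2))]

def featurize(s1, s2):
    # Concatenate frozen-module features with lexical overlap features.
    return frozen_features(s1, s2) + lexical_overlap_features(s1, s2)

def train_classifier(pairs, labels, epochs=200, lr=0.5):
    """Tune only a small logistic-regression head via SGD; the frozen
    feature extractor above is never updated."""
    dim = len(featurize(*pairs[0]))
    w, b = [0.0] * dim, 0.0
    for _ in range(epochs):
        for (s1, s2), y in zip(pairs, labels):
            x = featurize(s1, s2)
            p = 1.0 / (1.0 + math.exp(-(sum(wi * xi for wi, xi in zip(w, x)) + b)))
            g = p - y  # gradient of log loss w.r.t. the logit
            w = [wi - lr * g * xi for wi, xi in zip(w, x)]
            b -= lr * g
    return w, b

def predict(w, b, s1, s2):
    x = featurize(s1, s2)
    logit = sum(wi * xi for wi, xi in zip(w, x)) + b
    return 1.0 / (1.0 + math.exp(-logit)) >= 0.5

# Toy data: paraphrase (1) vs. non-paraphrase (0) pairs.
pairs = [
    ("the cat sat on the mat", "the cat is sitting on the mat"),
    ("he bought a new car", "he purchased a new car"),
    ("the weather is sunny today", "stocks fell sharply on monday"),
    ("she plays the violin", "the report was due friday"),
]
labels = [1, 1, 0, 0]
w, b = train_classifier(pairs, labels)
```

Because only `w` and `b` are updated, each new task (e.g. a natural language inference or semantic textual similarity classifier) would add just a classifier head on top of the same shared frozen features, which is the source of the parameter and time savings the abstract reports.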

Details

Language:
English
ISSN:
13513249
Volume:
29
Issue:
4
Database:
Complementary Index
Journal:
Natural Language Engineering
Publication Type:
Academic Journal
Accession number:
165037031
Full Text:
https://doi.org/10.1017/S135132492200050X