
Hyperbolic Pre-Trained Language Model

Authors :
Chen, Weize
Han, Xu
Lin, Yankai
He, Kaichen
Xie, Ruobing
Zhou, Jie
Liu, Zhiyuan
Sun, Maosong
Source :
IEEE/ACM Transactions on Audio, Speech, and Language Processing; 2024, Vol. 32, Issue 1, pp. 3101-3112, 12p
Publication Year :
2024

Abstract

In recent years, we have witnessed significant improvements in pre-trained language models (PLMs) brought about by scaling parameter counts and data volumes. However, this scaling also incurs high computational and storage costs. In this paper, we present a new direction for improving PLMs without scaling parameters or data: adopting a geometric feature space better suited to encoding the intrinsic structured features of text. Although text is generally considered unstructured data, it possesses rich intrinsic structured features that signify syntactic and semantic relationships, and leveraging these features is vital for text understanding. Given that structured features are better encoded in hyperbolic spaces than in the Euclidean spaces used by conventional PLMs, we propose that PLMs should operate entirely within hyperbolic spaces. Our experiments demonstrate the superiority of hyperbolic PLMs over Euclidean PLMs across a wide variety of tasks under the same parameter and data settings, suggesting that altering the geometry of model representation is a promising direction for model enhancement.
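This record summarizes the paper's thesis rather than its method, but the geometric intuition behind it is easy to demonstrate. The sketch below (Python with NumPy; an illustrative assumption, not the paper's actual code, which this record does not include) computes geodesic distance in the Poincare ball, one common model of hyperbolic space. Equal Euclidean displacements correspond to ever-larger hyperbolic distances near the boundary of the ball, which is why tree-like, hierarchical structures embed with lower distortion in hyperbolic space than in Euclidean space.

    import numpy as np

    def poincare_distance(u, v, eps=1e-9):
        # Geodesic distance in the Poincare ball (curvature -1).
        # This is one standard model of hyperbolic space, shown here
        # only to illustrate the geometry, not the paper's formulation.
        sq_u = np.sum(u * u)
        sq_v = np.sum(v * v)
        sq_diff = np.sum((u - v) ** 2)
        denom = max((1.0 - sq_u) * (1.0 - sq_v), eps)
        return np.arccosh(1.0 + 2.0 * sq_diff / denom)

    # Two pairs with the same Euclidean separation (0.1): one near the
    # origin, one near the boundary. The boundary pair is much farther
    # apart hyperbolically, giving hierarchies room to spread out.
    near_origin = poincare_distance(np.array([0.0, 0.0]), np.array([0.1, 0.0]))
    near_boundary = poincare_distance(np.array([0.8, 0.0]), np.array([0.9, 0.0]))
    print(near_origin, near_boundary)  # ~0.20 vs. ~0.74

This boundary behavior is the property the abstract appeals to: the volume of a hyperbolic ball grows exponentially with radius, matching the exponential growth of nodes in a tree, so syntactic and semantic hierarchies can be encoded with fewer dimensions and less distortion.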

Details

Language :
English
ISSN :
2329-9290
Volume :
32
Issue :
1
Database :
Supplemental Index
Journal :
IEEE/ACM Transactions on Audio, Speech, and Language Processing
Publication Type :
Periodical
Accession number :
ejs66691190
Full Text :
https://doi.org/10.1109/TASLP.2024.3407575