gMLP-KGE: a simple but efficient MLPs with gating architecture for link prediction.
- Source :
- Applied Intelligence; Oct2024, Vol. 54 Issue 19, p9594-9606, 13p
- Publication Year :
- 2024
-
Abstract
- Most existing knowledge graphs (KGs) suffer from incompleteness, which is detrimental to a variety of downstream applications. Link prediction, the task of predicting missing links in KGs, can effectively address this incompleteness via knowledge graph embedding (KGE). ConvE, a popular KGE model based on convolutional neural networks, has shown superiority in link prediction. Some subsequent extensions of ConvE achieve state-of-the-art performance by increasing complexity and training time, which results in a high risk of overfitting and limits performance, since a large number of parameters is concentrated in the fully connected projection layer. To address these challenges, we are the first to introduce and extend gMLP, a recent simple network architecture based on multi-layer perceptrons (MLPs) with gating originally used in vision applications, to link prediction. We propose a simple and efficient model called gMLP-KGE, which consists of an embedding layer, an input layer, an extended gMLP layer, and an output layer. Extensive experiments show that the number of parameters of gMLP-KGE is close to that of ConvE and lower than that of the other extension models, while gMLP-KGE performs consistently well on seven datasets of different scales under most evaluation metrics. [ABSTRACT FROM AUTHOR]
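- The abstract does not specify the internals of the extended gMLP layer; as a rough illustration only, the sketch below shows a generic gMLP block with a spatial gating unit in the style of the original gMLP architecture that the paper builds on. All shapes, weight names, and the near-identity gate initialization are assumptions for the example, not details taken from gMLP-KGE.

```python
import numpy as np

def gelu(x):
    # tanh approximation of the GELU activation
    return 0.5 * x * (1.0 + np.tanh(np.sqrt(2.0 / np.pi) * (x + 0.044715 * x**3)))

def gmlp_block(x, w1, w_s, b_s, w2):
    """One generic gMLP block (illustrative, not the paper's exact layer).

    x   : (n, d)        - n token embeddings (e.g. entity and relation vectors)
    w1  : (d, d_ff)     - channel-expansion projection
    w_s : (n, n)        - spatial (token-mixing) projection inside the gate
    b_s : (n, 1)        - spatial gate bias
    w2  : (d_ff//2, d)  - channel projection back to d dimensions
    """
    z = gelu(x @ w1)                 # expand channels and apply activation
    z1, z2 = np.split(z, 2, axis=-1) # spatial gating unit: split channels in half
    z2 = w_s @ z2 + b_s              # mix one half across the token dimension
    return x + (z1 * z2) @ w2        # gate, project back, add residual

# Toy usage with assumed sizes: 2 tokens (entity + relation), 8 dims
rng = np.random.default_rng(0)
n, d, d_ff = 2, 8, 16
x = rng.normal(size=(n, d))
w1 = rng.normal(size=(d, d_ff)) * 0.1
w_s = np.zeros((n, n))               # gate initialized near identity,
b_s = np.ones((n, 1))                # as in the original gMLP design
w2 = rng.normal(size=(d_ff // 2, d)) * 0.1
out = gmlp_block(x, w1, w_s, b_s, w2)  # shape (2, 8), same as the input
```

With the spatial weights initialized to zero and the bias to one, the gate starts as an identity, so early in training the block behaves like a plain MLP with a residual connection; the token-mixing behavior is learned through `w_s`.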
Details
- Language :
- English
- ISSN :
- 0924-669X
- Volume :
- 54
- Issue :
- 19
- Database :
- Complementary Index
- Journal :
- Applied Intelligence
- Publication Type :
- Academic Journal
- Accession number :
- 179041550
- Full Text :
- https://doi.org/10.1007/s10489-024-05677-7