
Additive Angular Margin Loss in Deep Graph Neural Network Classifier for Learning Graph Edit Distance

Authors :
Nadeem Iqbal Kajla
Malik Muhammad Saad Missen
Muhammad Muzzamil Luqman
Mickael Coustaty
Arif Mehmood
Gyu Sang Choi
Source :
IEEE Access, Vol 8, Pp 201752-201761 (2020)
Publication Year :
2020
Publisher :
Institute of Electrical and Electronics Engineers (IEEE), 2020.

Abstract

The recent success of graph neural networks (GNNs) in pattern recognition (PR) has increased researchers' interest in applying these frameworks to non-Euclidean structures such as graphs and manifolds, an area known as geometric deep learning (GDL). GDL has opened a new direction for handling graphs with deep learning in document processing, outperforming conventional methods. We propose a Deep Graph Neural Network (DGNN) classifier based on additive angular margin loss for classification tasks in document analysis. As a further contribution, we investigate the performance of the DGNN classifier under different loss functions for the document analysis problem. We compare additive angular margin loss, cosine angular margin loss, and multiplicative angular margin loss, and we additionally compare these loss functions against the softmax loss. We also report results obtained with different graph edit distance (GED) methods. Our quantitative results suggest that applying the additive angular margin loss increases intra-class compactness and inter-class discrepancy, which enhances the discriminating power of the DGNN: sharpening the decision boundaries between classes improves both the intra-class compactness and the inter-class discrimination of the model.
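To make the idea behind the additive angular margin concrete, the following is a minimal PyTorch sketch of an ArcFace-style classification head of the kind the abstract describes: it adds a fixed angular margin to the angle between an embedding and its target class center before the softmax cross-entropy is applied. The class name, the scale `s`, and the margin `m` are illustrative assumptions, not values or code taken from the paper.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class AdditiveAngularMarginLoss(nn.Module):
    """Illustrative additive angular margin (ArcFace-style) loss head.

    Hypothetical sketch: the defaults s=30.0 and m=0.5 are common choices
    in the face-recognition literature, not the authors' settings.
    """

    def __init__(self, embed_dim, num_classes, s=30.0, m=0.5):
        super().__init__()
        self.s = s  # feature scale applied to the logits
        self.m = m  # additive angular margin, in radians
        # One learnable "class center" vector per class.
        self.weight = nn.Parameter(torch.empty(num_classes, embed_dim))
        nn.init.xavier_uniform_(self.weight)

    def forward(self, embeddings, labels):
        # Cosine similarity between L2-normalised embeddings and class centers.
        cosine = F.linear(F.normalize(embeddings), F.normalize(self.weight))
        # Recover the angle and add the margin only for the target class.
        theta = torch.acos(cosine.clamp(-1.0 + 1e-7, 1.0 - 1e-7))
        one_hot = F.one_hot(labels, num_classes=self.weight.size(0)).float()
        logits = torch.cos(theta + self.m * one_hot)
        # Scale and feed to a standard softmax cross-entropy loss.
        return F.cross_entropy(self.s * logits, labels)
```

In a setup like the one described, such a head would replace the plain softmax layer on top of the graph-level embedding produced by the GNN; penalising the target-class angle by `m` is what pulls embeddings of the same class together while pushing different classes apart.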

Details

ISSN :
2169-3536
Volume :
8
Database :
OpenAIRE
Journal :
IEEE Access
Accession number :
edsair.doi.dedup.....efb164304c6396e9df2dc310b0fb947a