IEEE Access (Jan 2020)

Additive Angular Margin Loss in Deep Graph Neural Network Classifier for Learning Graph Edit Distance

  • Nadeem Iqbal Kajla,
  • Malik Muhammad Saad Missen,
  • Muhammad Muzzamil Luqman,
  • Mickael Coustaty,
  • Arif Mehmood,
  • Gyu Sang Choi

DOI
https://doi.org/10.1109/ACCESS.2020.3035886
Journal volume & issue
Vol. 8
pp. 201752 – 201761

Abstract

The recent success of graph neural networks (GNNs) in pattern recognition (PR) has increased researchers' interest in applying these frameworks to non-Euclidean structures such as graphs and manifolds, an area known as geometric deep learning (GDL). This has opened a new direction for handling graphs with deep learning in document processing, outperforming conventional methods. We propose a Deep Graph Neural Network (DGNN) classifier based on an additive angular margin loss for the classification task in document analysis. A further contribution of this work is an investigation of the performance of a DGNN classifier under different loss functions, which helps to minimize the loss for the document analysis problem. We compare the additive angular margin loss, the cosine angular margin loss, and the multiplicative angular margin loss, and we further compare these loss functions with the softmax loss. We also present comparisons of results obtained with different graph edit distance (GED) methods. Our quantitative results suggest that applying the additive angular margin loss increases intra-class compactness and inter-class discrepancy, which enhances the discriminating power of the DGNN: sharpening the decision boundaries between classes improves both the intra-class compactness and the inter-class discrimination of the model.
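For readers unfamiliar with the additive angular margin formulation mentioned above, the following is a minimal PyTorch-style sketch of an ArcFace-like classification head that adds an angular margin to the target-class logit before the softmax cross-entropy. The scale s and margin m values are illustrative assumptions, not settings reported in the paper, and the graph-level embedding is assumed to come from the authors' DGNN backbone.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class AdditiveAngularMarginHead(nn.Module):
    """Additive angular margin (ArcFace-style) classification head.

    Illustrative sketch only: s (scale) and m (margin) are assumed
    hyper-parameters, not values taken from the paper.
    """

    def __init__(self, in_features: int, num_classes: int, s: float = 30.0, m: float = 0.5):
        super().__init__()
        self.weight = nn.Parameter(torch.empty(num_classes, in_features))
        nn.init.xavier_uniform_(self.weight)
        self.s, self.m = s, m

    def forward(self, embeddings: torch.Tensor, labels: torch.Tensor) -> torch.Tensor:
        # cos(theta): cosine between L2-normalized embeddings and class weights
        cosine = F.linear(F.normalize(embeddings), F.normalize(self.weight))
        theta = torch.acos(cosine.clamp(-1.0 + 1e-7, 1.0 - 1e-7))
        # add the angular margin m only to the ground-truth class angle
        target_logit = torch.cos(theta + self.m)
        one_hot = F.one_hot(labels, num_classes=cosine.size(1)).float()
        logits = self.s * (one_hot * target_logit + (1.0 - one_hot) * cosine)
        return F.cross_entropy(logits, labels)


# Hypothetical usage: graph embeddings produced by a DGNN encoder
head = AdditiveAngularMarginHead(in_features=128, num_classes=10)
graph_embeddings = torch.randn(4, 128)          # stand-in for DGNN output
labels = torch.tensor([0, 3, 7, 1])
loss = head(graph_embeddings, labels)
loss.backward()
```

Replacing `torch.cos(theta + self.m)` with `cosine - m` (cosine margin) or scaling the angle multiplicatively would yield the cosine and multiplicative margin variants the abstract compares; setting m = 0 recovers a plain softmax head.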

Keywords