IEEE Access (Jan 2023)

Integrating Heterogeneous Graphs Using Graph Transformer Encoder for Solving Math Word Problems

  • Soyun Shin,
  • Jaehui Park,
  • Moonwook Ryu

DOI
https://doi.org/10.1109/ACCESS.2023.3257571
Journal volume & issue
Vol. 11
pp. 27609–27619

Abstract

This paper introduces a novel method that integrates structural information into the training of deep neural models for solving math word problems. Prior works adopt graph structures to represent the rich information residing in the input sentences, but they do not consider the different types of relations that hold between parts of a sentence. To provide various types of structural information in a uniform way, we propose a graph transformer encoder that integrates heterogeneous graphs built from various input representations. We develop two types of graph structures. First, the Dependency Graph maintains long-distance lexical dependencies between words and quantities. Second, the Question Overlap Graph captures the gist of the problem body. The two graphs are encoded as a single graph for the graph transformer. Experimental results show that our method produces competitive results compared to the baselines: our model outperforms state-of-the-art models in Equation and Answer accuracy by nearly three percentage points on the SVAMP benchmark. Moreover, we discuss how integrating different types of textual characteristics may improve the quality of mathematical logic inference from natural language sentences.
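The abstract describes the graph construction only at a high level. The sketch below is a minimal, hypothetical illustration of the merging step as described: dependency edges and question-overlap edges are collected into one heterogeneous edge set, each edge tagged with its relation type so that a single graph encoder can condition on both. All names (`dependency_edges`, `question_overlap_edges`, `merge_heterogeneous`) and the toy tokenization are illustrative assumptions, not the authors' code or their actual parser.

```python
# Hypothetical sketch: merging a Dependency Graph and a Question Overlap
# Graph into one heterogeneous, edge-typed graph. Names are illustrative
# assumptions; the paper's real construction uses a dependency parser.

from itertools import combinations

def dependency_edges(tokens):
    """Stand-in for a dependency parse: link each token to its neighbor.
    A real implementation would use head/child arcs from a parser, which
    can connect words and quantities over long distances."""
    return [(i, i + 1, "dep") for i in range(len(tokens) - 1)]

def question_overlap_edges(body_tokens, question_tokens):
    """Connect body tokens that also appear in the question, so the
    encoder can focus on the gist the question asks about."""
    question_vocab = set(question_tokens)
    overlap = [i for i, tok in enumerate(body_tokens) if tok in question_vocab]
    return [(i, j, "overlap") for i, j in combinations(overlap, 2)]

def merge_heterogeneous(*edge_lists):
    """Union the typed edge lists into one graph; the relation type is
    kept on each edge so attention can be conditioned on it."""
    merged = []
    for edges in edge_lists:
        merged.extend(edges)
    return merged

body = "Tom had 5 apples and bought 3 more".split()
question = "how many apples does Tom have".split()

graph = merge_heterogeneous(
    dependency_edges(body),
    question_overlap_edges(body, question),
)
print(graph)  # typed edges, e.g. (0, 1, 'dep') and (0, 3, 'overlap')
```

In this reading, keeping the relation type on each edge is what makes the merged graph "heterogeneous": a single transformer-style encoder can then learn separate attention behavior for lexical-dependency edges and question-overlap edges without needing two separate encoders.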

Keywords