Transactions of the Association for Computational Linguistics (Jan 2023)

Bridging the Gap between Synthetic and Natural Questions via Sentence Decomposition for Semantic Parsing

  • Yilin Niu
  • Fei Huang
  • Wei Liu
  • Jianwei Cui
  • Bin Wang
  • Minlie Huang

DOI
https://doi.org/10.1162/tacl_a_00552
Journal volume & issue
Vol. 11
pp. 367 – 383

Abstract


Semantic parsing maps natural language questions into logical forms, which can be executed against a knowledge base for answers. In real-world applications, the performance of a parser is often limited by the lack of training data. To facilitate zero-shot learning, data synthesis has been widely studied to automatically generate paired questions and logical forms. However, data synthesis methods can hardly cover the diverse structures in natural languages, leading to a large gap in sentence structure between synthetic and natural questions. In this paper, we propose a decomposition-based method to unify the sentence structures of questions, which benefits generalization to natural questions. Experiments demonstrate that our method significantly improves the semantic parser trained on synthetic data (+7.9% on KQA and +8.9% on ComplexWebQuestions in terms of exact match accuracy). Extensive analysis demonstrates that our method generalizes better than baselines to natural questions with novel text expressions. Beyond semantic parsing, our idea potentially benefits other semantic understanding tasks by mitigating distracting structural features. To illustrate this, we extend our method to the task of sentence embedding learning and observe substantial improvements on sentence retrieval (+13.1% for Hit@1).
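
To make the decomposition idea concrete, the following is a minimal, hypothetical sketch (not the authors' pipeline): a complex question is split into simpler sub-questions whose structure is closer to templated synthetic questions, each sub-question is parsed, and the sub-parses are recombined. The regex heuristic, the function names `decompose`/`parse`, and the `FIND`/`AND` logical-form notation are illustrative assumptions only.

    # Hypothetical illustration of question decomposition for semantic parsing.
    # Not the method from the paper; a toy sketch of the general idea.
    import re
    from typing import List

    def decompose(question: str) -> List[str]:
        """Split a conjunctive question into simpler sub-questions (toy heuristic)."""
        # Assumed pattern: "Which <noun> <clause1> and <clause2>?"
        m = re.match(r"(Which \w+) (.*) and (.*)\?", question)
        if not m:
            return [question]
        head, clause1, clause2 = m.groups()
        return [f"{head} {clause1}?", f"{head} {clause2}?"]

    def parse(sub_question: str) -> str:
        """Toy 'parser' mapping a simple sub-question to a logical-form fragment."""
        return f"FIND({sub_question.rstrip('?')})"

    if __name__ == "__main__":
        q = "Which movies were directed by Nolan and won an Oscar?"
        subs = decompose(q)
        # Recombine the sub-parses with an intersection, mimicking how
        # sub-question logical forms could be composed into a full query.
        logical_form = " AND ".join(parse(s) for s in subs)
        print(subs)         # ['Which movies were directed by Nolan?', 'Which movies won an Oscar?']
        print(logical_form) # FIND(Which movies were directed by Nolan) AND FIND(Which movies won an Oscar)

In practice, a parser trained on synthetic data sees far less structural variety than natural questions exhibit; rewriting natural questions into such simpler, more uniform sub-questions is one way to narrow that gap.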