Transactions of the Association for Computational Linguistics (Jan 2022)

Getting BART to Ride the Idiomatic Train: Learning to Represent Idiomatic Expressions

  • Ziheng Zeng,
  • Suma Bhat

DOI
https://doi.org/10.1162/tacl_a_00510
Journal volume & issue
Vol. 10
pp. 1120–1137

Abstract

Idiomatic expressions (IEs), characterized by their non-compositionality, are an important part of natural language. They have long been a challenge to NLP, including to the pre-trained language models that drive today's state of the art. Prior work has identified deficiencies in their contextualized representations stemming from the underlying compositional paradigm of representation. In this work, we take a first-principles approach to building idiomaticity into BART, using an adapter as a lightweight non-compositional language expert trained on idiomatic sentences. The improved capability over baselines (e.g., BART) is seen via intrinsic and extrinsic evaluations: idiom embeddings achieve a 0.19-point higher homogeneity score in embedding clustering, and up to 25% higher sequence accuracy on the idiom-processing tasks of IE sense disambiguation and span detection.
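The abstract describes the adapter only at a high level. The sketch below assumes the common bottleneck-adapter recipe (down-projection, nonlinearity, up-projection, residual) attached to a frozen BART; this is a plausible reading of "lightweight non-compositional language expert", not necessarily the paper's exact configuration. The class name BottleneckAdapter and the bottleneck width of 64 are illustrative.

```python
import torch
import torch.nn as nn
from transformers import BartModel

class BottleneckAdapter(nn.Module):
    """Illustrative bottleneck adapter: down-project, nonlinearity,
    up-project, and a residual connection back to the input."""

    def __init__(self, hidden_dim: int, bottleneck_dim: int = 64):
        super().__init__()
        self.down = nn.Linear(hidden_dim, bottleneck_dim)
        self.act = nn.ReLU()
        self.up = nn.Linear(bottleneck_dim, hidden_dim)

    def forward(self, hidden_states: torch.Tensor) -> torch.Tensor:
        # The residual keeps BART's original (compositional) representation;
        # the adapter learns only a small non-compositional correction.
        return hidden_states + self.up(self.act(self.down(hidden_states)))

# Freeze the pre-trained BART weights so that only the small adapter
# acts as the trainable "non-compositional language expert".
bart = BartModel.from_pretrained("facebook/bart-base")
for param in bart.parameters():
    param.requires_grad = False

adapter = BottleneckAdapter(hidden_dim=bart.config.d_model)
# During fine-tuning on idiomatic sentences, the adapter output would be
# spliced into each transformer layer; only adapter.parameters() update.
```

Training only the adapter keeps the trainable parameter count small relative to full fine-tuning, which is consistent with the abstract's "lightweight" characterization.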