International Journal of Computational Intelligence Systems (Nov 2023)

Lexical Normalization Using Generative Transformer Model (LN-GTM)

  • Mohamed Ashmawy
  • Mohamed Waleed Fakhr
  • Fahima A. Maghraby

DOI: https://doi.org/10.1007/s44196-023-00366-8
Journal volume & issue: Vol. 16, no. 1, pp. 1–11

Abstract

Lexical Normalization (LN) aims to normalize nonstandard text into standard text. The problem is of particular importance in natural language processing (NLP) when existing trained models are applied to user-generated text on social media, where users tend to write in non-standard language, relying heavily on abbreviations, phonetic substitutions, and colloquialisms. Most existing NLP systems, however, are designed with standard language in mind and therefore suffer significant performance drops on the many out-of-vocabulary words found in social media text. In this paper, we present a new LN technique that uses a transformer-based sequence-to-sequence (Seq2Seq) architecture to build a multilingual characters-to-words machine translation model. Unlike the majority of current methods, the proposed model can recognize and generate previously unseen words. It also greatly reduces the difficulties involved in tokenizing and preprocessing the nonstandard input text and the standard output text. The proposed model outperforms the winning entry of the Multilingual Lexical Normalization (MultiLexNorm) shared task at W-NUT 2021 on both intrinsic and extrinsic evaluations.
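To make the character-level Seq2Seq formulation concrete, the following is a minimal PyTorch sketch of a transformer normalizer. It is not the paper's LN-GTM implementation: the class name, hyperparameters, and the simplification of decoding characters (rather than the paper's word-level targets) are illustrative assumptions. Because the decoder emits open-vocabulary character sequences, it can in principle produce words never seen during training, which is the property the abstract highlights.

import torch
import torch.nn as nn

class CharSeq2SeqNormalizer(nn.Module):
    # Illustrative sketch only; not the LN-GTM architecture from the paper.
    def __init__(self, vocab_size, d_model=256, nhead=4, num_layers=3, max_len=512):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, d_model)  # shared character embeddings
        self.pos = nn.Embedding(max_len, d_model)       # learned positional embeddings
        self.transformer = nn.Transformer(
            d_model=d_model, nhead=nhead,
            num_encoder_layers=num_layers, num_decoder_layers=num_layers,
            batch_first=True,
        )
        self.out = nn.Linear(d_model, vocab_size)

    def _embed(self, ids):
        # Add learned positional information to the character embeddings.
        positions = torch.arange(ids.size(1), device=ids.device)
        return self.embed(ids) + self.pos(positions)

    def forward(self, src_ids, tgt_ids):
        # src_ids: noisy input as character ids, e.g. "u r gr8"
        # tgt_ids: shifted normalized target as character ids, e.g. "you are great"
        causal = nn.Transformer.generate_square_subsequent_mask(
            tgt_ids.size(1)).to(tgt_ids.device)
        h = self.transformer(self._embed(src_ids), self._embed(tgt_ids),
                             tgt_mask=causal)
        return self.out(h)  # per-position logits over the character vocabulary

Trained with teacher forcing and a cross-entropy loss over the target characters, such a model is decoded autoregressively at inference time from a start-of-sequence symbol.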

Keywords