ETRI Journal (Nov 2021)

Simple and effective neural coreference resolution for Korean language

  • Cheoneum Park,
  • Joonho Lim,
  • Jihee Ryu,
  • Hyunki Kim,
  • Changki Lee

DOI
https://doi.org/10.4218/etrij.2020-0282
Journal volume & issue
Vol. 43, no. 6
pp. 1038 – 1048

Abstract

We propose an end-to-end neural coreference resolution model for the Korean language that uses an attention mechanism to point to mentions of the same entity. Because Korean is a head-final language, we focused on a method that uses a pointer network based on the head word. The key idea is to treat all nouns in the document as candidate mentions, exploiting the head-final characteristic of Korean, and to learn a distribution over referenced entity positions for each noun. Given the recent success of Bidirectional Encoder Representations from Transformers (BERT) in natural language processing tasks, we employed BERT in the proposed model to create word representations based on contextual information. The experimental results indicated that the proposed model achieved state-of-the-art performance in Korean coreference resolution.
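To make the pointing idea concrete, the following is a minimal sketch (not the paper's exact model) of how a pointer-network-style attention can produce a distribution over candidate antecedent positions. The function name `pointer_attention`, the dot-product scorer, and the toy vectors are all illustrative assumptions; the paper's head-based scorer and BERT encoder are not reproduced here.

```python
import numpy as np

def pointer_attention(query, candidates):
    """Score candidate antecedent positions for one mention.

    query: (d,) vector for the current noun (e.g., a contextual
           head-token representation).
    candidates: (n, d) matrix of vectors for candidate nouns.
    Returns a probability distribution over the n positions.
    """
    scores = candidates @ query      # dot-product attention scores (illustrative)
    scores -= scores.max()           # subtract max for numerical stability
    probs = np.exp(scores)
    return probs / probs.sum()       # softmax over candidate positions

# Toy example: 3 candidate nouns with 4-dimensional representations.
rng = np.random.default_rng(0)
cands = rng.normal(size=(3, 4))
q = cands[1] + 0.01 * rng.normal(size=4)  # query similar to candidate 1
dist = pointer_attention(q, cands)
# The model "points" to the candidate with the highest probability.
```

In the actual model, the distribution would be computed over all nouns preceding the current mention, with BERT supplying the contextual vectors.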

Keywords