IEEE Access (Jan 2021)
A Graph Convolutional Network With Multiple Dependency Representations for Relation Extraction
Abstract
Dependency analysis can help neural networks capture semantic features within a sentence for entity relation extraction (RE). Both hard and soft strategies for encoding the dependency tree structure have been developed to balance the beneficial extra information against unfavorable interference in the RE task. The wide application of graph convolutional networks (GCNs) in natural language processing (NLP) has demonstrated their effectiveness in encoding an input sentence along its dependency tree structure, as well as their efficiency in parallel computation. This study proposes a novel GCN-based model that uses multiple representations to depict the dependency tree from various perspectives and then combines these dependency representations to obtain a better sentence representation for relation classification. This design allows the model to draw from the sentence as many of the semantic features relevant to the relationship between entities as possible. Results show that our model achieves state-of-the-art performance in terms of the F1 score (68.0) on the Text Analysis Conference relation extraction dataset (TACRED). In addition, we verify that the renormalization parameter in the GCN operation should be carefully chosen to help GCN-based models achieve their best performance.
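As a rough illustration of the encoding step summarized above, the following minimal sketch (ours, not taken from the paper's code) shows a single GCN layer applied to a dependency adjacency matrix; the self-loop weight `lam` is a hypothetical stand-in for the renormalization parameter the abstract refers to.

```python
import numpy as np

def gcn_layer(A, H, W, lam=1.0):
    """One GCN layer over a dependency graph (illustrative sketch).

    A:   (n, n) undirected 0/1 adjacency matrix built from the dependency parse
    H:   (n, d_in) token (node) features
    W:   (d_in, d_out) layer weights
    lam: hypothetical self-loop weight used in the renormalization step
    """
    A_tilde = A + lam * np.eye(A.shape[0])      # add (weighted) self-loops
    d = A_tilde.sum(axis=1)                     # node degrees of A_tilde
    D_inv_sqrt = np.diag(1.0 / np.sqrt(d))      # D̃^{-1/2}
    A_hat = D_inv_sqrt @ A_tilde @ D_inv_sqrt   # renormalized adjacency Â
    return np.maximum(A_hat @ H @ W, 0.0)       # ReLU(Â H W)
```

In a full model one would build `A` from the dependency arcs of the sentence, stack several such layers, and pool the resulting entity and sentence representations for relation classification; the paper combines several such dependency representations before pooling.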
Keywords