PLoS ONE (Sep 2021)

BERTtoCNN: Similarity-preserving enhanced knowledge distillation for stance detection.

  • Yang Li,
  • Yuqing Sun,
  • Nana Zhu

DOI
https://doi.org/10.1371/journal.pone.0257130
Journal volume & issue
Vol. 16, no. 9
p. e0257130

Abstract


In recent years, text sentiment analysis has attracted wide attention and has promoted the rise and development of stance detection research. The purpose of stance detection is to determine the author's stance (favor or against) toward a specific target or proposition in a text. Pre-trained language models such as BERT have been shown to perform well on this task. However, in many real-world settings they are computationally expensive, because such heavy models are difficult to deploy with limited resources. To improve efficiency while preserving performance, we propose BERTtoCNN, a knowledge distillation model that combines the classic distillation loss and a similarity-preserving loss in a joint knowledge distillation framework. On the one hand, BERTtoCNN provides an efficient distillation process that trains a novel 'student' CNN structure from a much larger 'teacher' language model, BERT. On the other hand, the similarity-preserving loss guides the training of the student network so that input pairs with similar (dissimilar) activations in the teacher network also have similar (dissimilar) activations in the student network. We conduct experiments on open Chinese and English stance detection datasets. The experimental results show that our model clearly outperforms competitive baseline methods.
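For readers who want a concrete picture of the joint objective, the sketch below shows one way to combine the two loss terms in PyTorch. It is a minimal illustration, not the authors' released implementation: the function names, the temperature T, and the weights alpha and gamma are assumptions. The classic distillation term follows Hinton et al.'s softened-logit KL divergence, and the similarity-preserving term follows Tung and Mori (2019), matching row-normalized batch-wise activation-similarity matrices between teacher and student.

    import torch
    import torch.nn.functional as F

    def distillation_loss(student_logits, teacher_logits, T=4.0):
        """Classic distillation: KL divergence between temperature-softened
        teacher and student class distributions (Hinton et al.)."""
        p_teacher = F.softmax(teacher_logits / T, dim=1)
        log_p_student = F.log_softmax(student_logits / T, dim=1)
        # Scaling by T^2 keeps gradient magnitudes comparable to the
        # hard-label cross-entropy term.
        return F.kl_div(log_p_student, p_teacher, reduction="batchmean") * T * T

    def similarity_preserving_loss(student_feats, teacher_feats):
        """Similarity-preserving loss (Tung & Mori, 2019): inputs whose
        teacher activations are similar (dissimilar) should also have
        similar (dissimilar) student activations."""
        b = student_feats.size(0)
        s = student_feats.view(b, -1)
        t = teacher_feats.view(b, -1)
        # Batch-wise pairwise similarity matrices (b x b), row-normalized.
        g_s = F.normalize(s @ s.t(), p=2, dim=1)
        g_t = F.normalize(t @ t.t(), p=2, dim=1)
        # Squared Frobenius norm of the difference, averaged over b^2 entries.
        return F.mse_loss(g_s, g_t, reduction="sum") / (b * b)

    def bert_to_cnn_loss(student_logits, student_feats,
                         teacher_logits, teacher_feats,
                         labels, alpha=0.5, gamma=1.0, T=4.0):
        """Joint objective: cross-entropy on gold stance labels plus the
        classic distillation term and the similarity-preserving term.
        The alpha/gamma weights and T are illustrative, not the paper's
        reported hyperparameters."""
        ce = F.cross_entropy(student_logits, labels)
        kd = distillation_loss(student_logits, teacher_logits, T)
        sp = similarity_preserving_loss(student_feats, teacher_feats)
        return ce + alpha * kd + gamma * sp

In training, teacher_logits and teacher_feats would come from a frozen, fine-tuned BERT and the student tensors from the CNN, so gradients flow only into the student network.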