IEEE Access (Jan 2025)

SECNN: Squeeze-and-Excitation Convolutional Neural Network for Sentence Classification

  • Shandong Yuan,
  • Zili Zou,
  • Han Zhou,
  • Yun Ren,
  • Jianping Wu,
  • Kai Yan

DOI
https://doi.org/10.1109/ACCESS.2025.3548111
Journal volume & issue
Vol. 13
pp. 42858 – 42865

Abstract

Sentence classification constitutes a fundamental task in natural language processing. Convolutional Neural Networks (CNNs) have gained prominence in this domain due to their capacity to extract n-gram features through parallel convolutional filters, effectively capturing local lexical correlations. However, due to the constrained receptive field of convolutional operations, conventional CNNs exhibit limitations in modeling long-range contextual dependencies. To address this, attention mechanisms, which enable global contextual modeling and keyword saliency detection, have been integrated with CNN architectures to enhance classification performance. Diverging from conventional approaches that emphasize lexical-level attention, this study introduces a novel Squeeze-and-Excitation Convolutional Neural Network (SECNN) that implements channel-wise attention on CNN feature maps. Specifically, SECNN aggregates multi-scale convolutional features as distinct semantic channels and employs Squeeze-and-Excitation (SE) blocks to learn channel-wise attention weights, thereby enabling dynamic feature recalibration based on inter-channel dependencies. Across the MR, IMDb, AGNews and DBpedia benchmark datasets, the proposed model achieves marginal yet consistent improvements (0.2% F1 on MR; 0.1% on DBpedia) over baseline methods, suggesting statistically significant advantages in two of the four evaluated tasks.
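The channel-wise recalibration the abstract describes follows the standard SE recipe: squeeze each feature channel to a scalar by global average pooling, pass the channel descriptor through a small bottleneck network, and gate each channel with the resulting sigmoid weight. A minimal NumPy sketch of that operation (not the authors' implementation; the function name, shapes, and reduction ratio `r` are illustrative assumptions):

```python
import numpy as np

def se_recalibrate(feature_maps, w1, w2):
    """Squeeze-and-Excitation recalibration of CNN feature channels.

    feature_maps: (C, L) array -- C convolutional feature channels of length L
                  (e.g., pooled multi-scale n-gram features as channels).
    w1: (C // r, C) and w2: (C, C // r) -- learned bottleneck weights
        for reduction ratio r (illustrative stand-ins for trained parameters).
    """
    # Squeeze: global average pooling collapses each channel to one scalar -> (C,)
    z = feature_maps.mean(axis=1)
    # Excitation: bottleneck MLP (ReLU) followed by a sigmoid gate -> (C,) in (0, 1)
    s = np.maximum(w1 @ z, 0.0)
    gate = 1.0 / (1.0 + np.exp(-(w2 @ s)))
    # Recalibrate: rescale each channel by its learned attention weight
    return feature_maps * gate[:, None]

rng = np.random.default_rng(0)
C, L, r = 8, 16, 4                      # 8 channels, length 16, reduction 4
x = rng.normal(size=(C, L))
out = se_recalibrate(x, rng.normal(size=(C // r, C)), rng.normal(size=(C, C // r)))
print(out.shape)
```

Because the sigmoid gate lies in (0, 1), each channel of the output is a damped copy of the input channel; in SECNN this gating is what lets the network emphasize the more informative multi-scale feature channels before classification.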

Keywords