IEEE Access (Jan 2022)

Parallel Recurrent Module With Inter-Layer Attention for Capturing Long-Range Feature Relationships

  • Eunseok Kim,
  • Jihwan Bae,
  • Inwook Shim

DOI
https://doi.org/10.1109/ACCESS.2022.3182492
Journal volume & issue
Vol. 10
pp. 61960–61969

Abstract


Capturing long-range feature relationships is becoming a central issue for convolutional neural networks (CNNs). In particular, several recent end-to-end trainable attention modules have attempted to model spatial-channel relationships within a given layer. In this work, we focus instead on modeling relationships among visual information captured in different layers and propose a novel module, referred to as a Parallel Recurrent Module with Inter-layer Attention (PI module). The PI module exhibits several unique characteristics, including the ability to memorize information from earlier layers and to ameliorate gradient vanishing, both issues that existing attention modules do not address. Furthermore, owing to its easy-to-adopt structure and negligible computational overhead, the module extends not only to CNNs on regular grids but also to graph convolutional networks, and even to other attention modules. We demonstrate through extensive experiments that the PI module is cost-efficient yet effectively provides additional performance gains on multiple benchmarks in classification, detection, and segmentation tasks in the image domain and a segmentation task in the point cloud domain.
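The abstract describes attending over features collected from different layers rather than within a single layer. A minimal conceptual sketch of that idea, using NumPy and globally pooled channel descriptors, is shown below; the function name and the choice of the deepest layer as the attention query are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def inter_layer_attention(layer_feats):
    """Conceptual sketch (not the paper's code) of inter-layer attention:
    fuse channel descriptors gathered from successive layers by
    softmax-weighting each layer's contribution.

    layer_feats: list of L arrays, each of shape (C,), e.g. globally
    average-pooled channel descriptors from L different CNN layers.
    Returns a (C,)-shaped attention-weighted fusion over layers.
    """
    F = np.stack(layer_feats)                    # (L, C): one row per layer
    query = F[-1]                                # assume deepest layer as query
    scores = F @ query / np.sqrt(F.shape[1])     # (L,) scaled dot-product scores
    weights = np.exp(scores - scores.max())      # numerically stable softmax
    weights /= weights.sum()                     # attention over the L layers
    return weights @ F                           # weighted sum of layer features
```

Because the fusion reduces to a softmax-weighted sum over stored layer descriptors, earlier-layer information remains directly reachable at the output, which is one plausible reading of how such a module could help with memorizing earlier layers and with gradient flow.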

Keywords