Remote Sensing (Nov 2024)

FE-SKViT: A Feature-Enhanced ViT Model with Skip Attention for Automatic Modulation Recognition

  • Guangyao Zheng,
  • Bo Zang,
  • Penghui Yang,
  • Wenbo Zhang,
  • Bin Li

DOI
https://doi.org/10.3390/rs16224204
Journal volume & issue
Vol. 16, no. 22
p. 4204

Abstract


Automatic modulation recognition (AMR) is widely employed in communication systems. However, under low signal-to-noise ratio (SNR) conditions, recent studies report that achieving high AMR accuracy remains difficult. In this work, we introduce a novel network architecture that leverages a transformer-inspired approach tailored for AMR, called the Feature-Enhanced Transformer with skip-attention (FE-SKViT). This design combines the advantages of translation-variant convolution and the Transformer framework, handling intra-signal variance and small cross-signal variance to achieve enhanced recognition accuracy. Experimental results on the RadioML2016.10a, RadioML2016.10b, and RML22 datasets demonstrate that FE-SKViT outperforms other methods, particularly under low SNR conditions ranging from −4 to 6 dB.
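To make the abstract's architectural idea concrete, the following is a minimal PyTorch sketch of the general pattern it describes: a convolutional front end that tokenizes raw I/Q signal frames, followed by Transformer blocks in which some layers reuse the attention output of an earlier layer through a lightweight projection (the general "skip-attention" idea) rather than recomputing self-attention. This is not the authors' implementation; all class names (FeatureEnhanceStem, SkipAttentionBlock, TinyFESKViT) and hyperparameters are hypothetical, chosen only to illustrate the concept on RadioML-style inputs.

```python
import torch
import torch.nn as nn


class FeatureEnhanceStem(nn.Module):
    """Hypothetical convolutional stem: maps an I/Q frame of shape
    (batch, 2, length) to a token sequence of shape (batch, tokens, dim)."""
    def __init__(self, dim=64, patch=8):
        super().__init__()
        self.conv = nn.Sequential(
            nn.Conv1d(2, dim, kernel_size=3, padding=1),
            nn.BatchNorm1d(dim),
            nn.GELU(),
            nn.Conv1d(dim, dim, kernel_size=patch, stride=patch),  # patchify along time
        )

    def forward(self, x):
        return self.conv(x).transpose(1, 2)  # (B, tokens, dim)


class SkipAttentionBlock(nn.Module):
    """Transformer block that can reuse the attention output of an earlier
    block via a small projection instead of recomputing self-attention
    (illustrating the generic skip-attention idea, not the paper's exact design)."""
    def __init__(self, dim=64, heads=4, skip=False):
        super().__init__()
        self.skip = skip
        self.norm1 = nn.LayerNorm(dim)
        if skip:
            self.reuse = nn.Sequential(nn.Linear(dim, dim), nn.GELU(), nn.Linear(dim, dim))
        else:
            self.attn = nn.MultiheadAttention(dim, heads, batch_first=True)
        self.norm2 = nn.LayerNorm(dim)
        self.mlp = nn.Sequential(nn.Linear(dim, 4 * dim), nn.GELU(), nn.Linear(4 * dim, dim))

    def forward(self, x, prev_attn=None):
        if self.skip and prev_attn is not None:
            attn_out = self.reuse(prev_attn)  # cheap approximation of attention
        else:
            h = self.norm1(x)
            attn_out, _ = self.attn(h, h, h)
        x = x + attn_out
        x = x + self.mlp(self.norm2(x))
        return x, attn_out


class TinyFESKViT(nn.Module):
    """Toy end-to-end model: conv stem + alternating full/skip attention
    blocks + a classification head over modulation classes."""
    def __init__(self, num_classes=11, dim=64, depth=4):
        super().__init__()
        self.stem = FeatureEnhanceStem(dim)
        self.blocks = nn.ModuleList(
            [SkipAttentionBlock(dim, skip=(i % 2 == 1)) for i in range(depth)]
        )
        self.head = nn.Linear(dim, num_classes)

    def forward(self, x):
        tokens = self.stem(x)
        attn_out = None
        for blk in self.blocks:
            tokens, attn_out = blk(tokens, attn_out)
        return self.head(tokens.mean(dim=1))


if __name__ == "__main__":
    model = TinyFESKViT()
    iq = torch.randn(4, 2, 128)   # RadioML-style I/Q frames of length 128
    print(model(iq).shape)        # torch.Size([4, 11])
```

In this sketch, reusing attention in alternating blocks reduces the cost of self-attention while the convolutional stem supplies local signal features; the actual FE-SKViT combines these ingredients differently, as detailed in the paper.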

Keywords