IEEE Access (Jan 2024)

Meta-Transformer: A Meta-Learning Framework for Scalable Automatic Modulation Classification

  • Jungik Jang,
  • Jisung Pyo,
  • Young-Il Yoon,
  • Jaehyuk Choi

DOI: https://doi.org/10.1109/ACCESS.2024.3352634
Journal volume & issue: Vol. 12, pp. 9267–9276

Abstract

Recent advances in deep learning (DL) have led many contemporary automatic modulation classification (AMC) techniques to use deep networks to classify the modulation type of incoming signals at the receiver. However, current DL-based methods face scalability challenges, particularly when encountering unseen modulations or input signals from environments not represented during training, making them less suitable for real-world applications such as software-defined radio devices. In this paper, we introduce a scalable AMC scheme that offers flexibility toward new modulations and adaptability to input signals with diverse configurations. We propose the Meta-Transformer, a meta-learning framework based on few-shot learning (FSL) that acquires general knowledge and a learning method for AMC tasks. This approach enables the model to identify previously unseen modulations from only a handful of samples, eliminating the need to retrain the entire model. Furthermore, we enhance the scalability of the classifier with main-sub transformer-based encoders, enabling efficient processing of input signals with diverse setups. Extensive evaluations demonstrate that the proposed AMC method outperforms existing techniques across all signal-to-noise ratios (SNRs) on RadioML2018.01A. The source code and pre-trained models are available at https://github.com/cheeseBG/meta-transformer-amc.
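
To make the few-shot classification idea concrete, the following is a minimal sketch of one FSL episode for AMC: a small transformer encoder embeds raw I/Q sequences, and each query signal is assigned to the nearest class prototype computed from a few labeled support samples. This is a generic prototypical-network-style illustration in PyTorch, not the authors' Meta-Transformer; the names (IQTransformerEncoder, prototype_logits) and all hyperparameters are assumptions made for illustration.

    import torch
    import torch.nn as nn

    class IQTransformerEncoder(nn.Module):
        """Embed an I/Q sequence of shape (batch, length, 2) into a vector."""
        def __init__(self, d_model=64, nhead=4, num_layers=2):
            super().__init__()
            self.proj = nn.Linear(2, d_model)  # lift each I/Q pair to d_model
            layer = nn.TransformerEncoderLayer(d_model, nhead, batch_first=True)
            self.encoder = nn.TransformerEncoder(layer, num_layers)

        def forward(self, x):                  # x: (B, L, 2)
            h = self.encoder(self.proj(x))     # (B, L, d_model)
            return h.mean(dim=1)               # mean-pool over time -> (B, d_model)

    def prototype_logits(encoder, support, support_labels, query, n_way):
        """Score queries by negative distance to per-class prototype embeddings."""
        z_s = encoder(support)                 # (n_way * k_shot, d)
        z_q = encoder(query)                   # (n_query, d)
        protos = torch.stack([z_s[support_labels == c].mean(dim=0)
                              for c in range(n_way)])  # (n_way, d)
        return -torch.cdist(z_q, protos)       # larger logit = closer prototype

    # Toy 5-way 1-shot episode with random stand-ins for I/Q signals.
    enc = IQTransformerEncoder()
    support = torch.randn(5, 128, 2)           # one labeled example per class
    labels = torch.arange(5)
    query = torch.randn(8, 128, 2)             # unlabeled signals to classify
    preds = prototype_logits(enc, support, labels, query, n_way=5).argmax(dim=1)

In such a scheme, supporting a new modulation amounts to computing a fresh prototype from a few labeled samples at inference time, with no gradient updates, which mirrors the retraining-free adaptation described in the abstract.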
