EURASIP Journal on Wireless Communications and Networking (Nov 2023)

Modified state activation functions of deep learning-based SC-FDMA channel equalization system

  • Mohamed A. Mohamed,
  • Hassan A. Hassan,
  • Mohamed H. Essai,
  • Hamada Esmaiel,
  • Ahmed S. Mubarak,
  • Osama A. Omer

DOI
https://doi.org/10.1186/s13638-023-02326-4
Journal volume & issue
Vol. 2023, no. 1
pp. 1–26

Abstract

The most important function of deep learning (DL) channel equalization and symbol detection systems is the ability to predict the user’s original transmitted data. In general, the behavior and performance of deep artificial neural networks (DANNs) rely on three main aspects: the network structure, the learning algorithm, and the activation functions (AFs) used in each node of the network. Long short-term memory (LSTM) recurrent neural networks have shown some success in channel equalization and symbol detection, and the AFs used in a DANN play a significant role in how its learning algorithm converges. This article shows how modifying the AFs used in the tanh units (block input and output) of the LSTM cells can significantly boost the DL equalizer’s performance. Additionally, the learning process of the DL model was optimized using two distinct loss functions: the default cross-entropy and the sum of squared errors (SSE). The DL model’s performance with different AFs is compared across three learning algorithms: Adam, RMSProp, and SGDM. The findings clearly demonstrate that the most frequently used AFs (the sigmoid and hyperbolic tangent functions) do not necessarily yield the best network behavior in channel equalization, while several less common AFs can outperform them. Furthermore, the results show that the recommended SSE loss function addresses the channel equalization problem better than the default cross-entropy loss.
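
To illustrate the kind of modification the abstract describes, the sketch below implements a single LSTM step in NumPy in which the two tanh units (block input and block output) are swappable while the three gates keep their standard sigmoid activations. This is a minimal sketch, not the authors’ implementation: the softsign substitute and the sse_loss helper are illustrative assumptions showing where an alternative AF and the recommended SSE loss would plug in.

    import numpy as np

    def sigmoid(x):
        return 1.0 / (1.0 + np.exp(-x))

    def softsign(x):
        # illustrative alternative AF; one of many non-standard candidates
        return x / (1.0 + np.abs(x))

    def lstm_step(x, h_prev, c_prev, W, b, g=np.tanh, h_act=np.tanh):
        """One LSTM step with swappable block-input (g) and block-output
        (h_act) activations; the i/f/o gates keep the standard sigmoid."""
        n = h_prev.size
        z = W @ np.concatenate([x, h_prev]) + b   # stacked pre-activations (4n,)
        i = sigmoid(z[0:n])                       # input gate
        f = sigmoid(z[n:2*n])                     # forget gate
        o = sigmoid(z[2*n:3*n])                   # output gate
        c = f * c_prev + i * g(z[3*n:4*n])        # cell state; g replaces tanh here
        h = o * h_act(c)                          # hidden state; h_act replaces tanh
        return h, c

    def sse_loss(y_pred, y_true):
        # sum of squared errors, the loss the article recommends over cross-entropy
        return np.sum((y_pred - y_true) ** 2)

    # Usage: one toy step with the standard tanh cell vs. a softsign-modified cell.
    rng = np.random.default_rng(0)
    n_in, n_hid = 4, 8
    x = rng.standard_normal(n_in)
    h0, c0 = np.zeros(n_hid), np.zeros(n_hid)
    W = rng.standard_normal((4 * n_hid, n_in + n_hid)) * 0.1
    b = np.zeros(4 * n_hid)
    h_tanh, _ = lstm_step(x, h0, c0, W, b)                              # standard cell
    h_soft, _ = lstm_step(x, h0, c0, W, b, g=softsign, h_act=softsign)  # modified cell

Because only the block-input and block-output nonlinearities are parameterized, any candidate AF can be compared against tanh under otherwise identical weights, which mirrors the comparison methodology the abstract outlines.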

Keywords