IEEE Access (Jan 2019)

Equivalence of Joint ML-Decoding and Separate MMSE-ML Decoding for Training-Based MIMO Systems

  • Peng Pan,
  • Haiquan Wang,
  • Lei Shen,
  • Conghui Lu

DOI
https://doi.org/10.1109/ACCESS.2019.2958700
Journal volume & issue
Vol. 7
pp. 178862 – 178869

Abstract


To gain insight into the potential and behavior of training-based MIMO systems, the relationship between the joint decoding scheme and the separate decoding scheme is considered, and the equivalence between these two decoding schemes is proved. In the considered joint decoding scheme, the receiver decodes the data by jointly processing the received signals of both the training symbols and the data symbols in the maximum-likelihood (ML) sense; we refer to it as the joint ML-decoder. By contrast, in the considered separate decoding scheme, the receiver first estimates the channel in the minimum mean-square-error (MMSE) sense and then, based on the estimated channel information and the received signals of the data symbols, decodes the data in the ML sense; we refer to it as the separate MMSE-ML decoder. Note that this separate MMSE-ML decoder differs from the decoder that appears in most existing works, where a kind of mismatched ML decoding is applied after the channel-estimation phase. Although the two decoding schemes have different procedures, we prove that the joint ML-decoder and the separate MMSE-ML decoder are equivalent, and that both outperform the mismatched ML decoder. This equivalence implies that the MMSE channel estimator is optimal when the overall system performance is considered, and that the separate MMSE-ML decoding scheme achieves the same performance as the joint ML decoding scheme. Furthermore, the equivalence provides another way to analyze the performance of the considered separate or joint decoding scheme: results obtained for the joint decoding scheme can be applied directly to the separate decoding scheme, and vice versa.
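The two-phase structure described in the abstract can be sketched numerically. The toy model below is an illustrative assumption, not the paper's exact setup: a 2x2 i.i.d. Rayleigh-fading channel, a scaled-identity training matrix, an MMSE channel estimate, and then an exhaustive ML search over QPSK data vectors using that estimate. The detection metric shown is the plain mismatched metric ||y - Hhat x||^2; the paper's separate MMSE-ML decoder uses a different (estimation-error-aware) metric, which this sketch omits.

```python
import numpy as np
from itertools import product

rng = np.random.default_rng(0)

# Toy training-based MIMO setup (illustrative assumptions): Nt transmit
# and Nr receive antennas, i.i.d. CN(0,1) channel entries, noise
# variance sigma2, training sent over T = Nt symbol periods.
Nt, Nr = 2, 2
p_train = 10.0   # training power per antenna
sigma2 = 1e-4    # noise variance (high SNR, for a clean illustration)

H = (rng.standard_normal((Nr, Nt))
     + 1j * rng.standard_normal((Nr, Nt))) / np.sqrt(2)

# --- Training phase: Yt = H @ Xt + noise -----------------------------
Xt = np.sqrt(p_train) * np.eye(Nt)
noise_t = np.sqrt(sigma2 / 2) * (rng.standard_normal((Nr, Nt))
                                 + 1j * rng.standard_normal((Nr, Nt)))
Yt = H @ Xt + noise_t

# MMSE channel estimate for i.i.d. CN(0,1) channel entries:
#   Hhat = Yt Xt^H (Xt Xt^H + sigma2 I)^{-1}
Hhat = Yt @ Xt.conj().T @ np.linalg.inv(
    Xt @ Xt.conj().T + sigma2 * np.eye(Nt))

# --- Data phase: y = H x + noise, QPSK on each antenna ---------------
qpsk = np.array([1 + 1j, 1 - 1j, -1 + 1j, -1 - 1j]) / np.sqrt(2)
x = qpsk[rng.integers(4, size=Nt)]
noise_d = np.sqrt(sigma2 / 2) * (rng.standard_normal(Nr)
                                 + 1j * rng.standard_normal(Nr))
y = H @ x + noise_d

# ML step of the separate scheme: exhaustive search over all QPSK
# vectors, here with the simple mismatched metric ||y - Hhat x||^2.
candidates = [np.array(c) for c in product(qpsk, repeat=Nt)]
x_hat = min(candidates, key=lambda c: np.linalg.norm(y - Hhat @ c) ** 2)

print("estimation error:", np.linalg.norm(H - Hhat))
print("detected correctly:", bool(np.allclose(x_hat, x)))
```

At this SNR the MMSE estimate is close to the true channel, so even the mismatched metric detects reliably; the paper's point is that replacing it with the exact MMSE-ML metric recovers joint-ML performance at every SNR.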

Keywords