IEEE Access (Jan 2019)
Tensor-Based Channel Estimation for Massive MIMO-OFDM Systems
Abstract
Channel estimation is a crucial problem in massive multiple-input multiple-output (MIMO) systems, which must estimate the channel accurately to achieve the expected gains in spectral and energy efficiency. However, to estimate the large number of channel coefficients in the space and frequency domains, a considerable number of pilots is usually spread over many time-frequency resources of the orthogonal frequency division multiplexing (OFDM) grid, sacrificing spectral efficiency. In this paper, assuming MIMO-OFDM transmission, we first propose a tensor-based minimum mean square error (T-MMSE) channel estimator that exploits the multidimensional structure of the frequency-selective massive MIMO channel in the frequency domain and serves as a low-complexity alternative to the well-known vector-MMSE channel estimator. Then, by incorporating a 3D sparse representation into the tensor-based channel model, we formulate a tensor compressive sensing (tensor-CS) model in which the channel is compressively sampled in space (radio-frequency chains), time (symbol periods), and frequency (pilot subcarriers). This tensor-CS model is the basis for a tensor orthogonal matching pursuit (T-OMP) estimator that solves a greedy problem per dimension of the measured data tensor. The proposed estimator has two variants, which resort either to a joint search per tensor dimension or to a sequential search that progressively reduces the search space across the tensor dimensions. We analyze the complexity of the tensor-based algorithms and compare it to that of the traditional vector-MMSE and vector-CS estimators. Our results also corroborate the performance-complexity tradeoffs between the T-MMSE and T-OMP estimators, both of which are competitive alternatives to their vector-based MMSE and OMP counterparts.
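The per-dimension greedy search that underlies T-OMP can be illustrated with a toy example. The sketch below is hypothetical and not the paper's actual algorithm: it assumes a rank-one (single-path) measurement tensor built from one atom of each of three made-up dictionaries (space, time, frequency), and shows how one greedy step per tensor dimension recovers the support via mode-n unfoldings.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy setup (all sizes hypothetical): a 3-way measurement tensor
# Y = d1 (outer) d2 (outer) d3, where d_n is one column of the
# mode-n dictionary D_n. A single path keeps the example rank-one.
meas_sizes = (24, 28, 32)   # measurements in space, time, frequency
dict_sizes = (40, 44, 48)   # dictionary atoms per dimension
true = (5, 17, 30)          # planted support index in each dimension

D = [rng.standard_normal((m, a)) for m, a in zip(meas_sizes, dict_sizes)]
Y = np.einsum('i,j,k->ijk',
              D[0][:, true[0]], D[1][:, true[1]], D[2][:, true[2]])

est = []
for n in range(3):
    # Mode-n unfolding: matricize Y with mode n along the rows.
    Yn = np.moveaxis(Y, n, 0).reshape(Y.shape[n], -1)
    # Greedy step: pick the atom most correlated with the unfolded data.
    corr = np.linalg.norm(D[n].T @ Yn, axis=1)
    est.append(int(np.argmax(corr)))

print(est)  # per-dimension support estimate
```

Because each unfolding of a rank-one tensor is itself rank-one, correlating the mode-n dictionary with the unfolding isolates the mode-n atom, which is why the search can proceed dimension by dimension instead of over one large Kronecker-structured vector dictionary.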
Keywords