International Journal of Antennas and Propagation (Jan 2012)

Efficient Rank-Adaptive Least-Square Estimation and Multiple-Parameter Linear Regression Using Novel Dyadically Recursive Hermitian Matrix Inversion

  • Hsiao-Chun Wu,
  • Shih Yu Chang,
  • Tho Le-Ngoc,
  • Yiyan Wu

DOI: https://doi.org/10.1155/2012/891932
Journal volume & issue: Vol. 2012

Abstract

Least-square estimation (LSE) and multiple-parameter linear regression (MLR) are important estimation techniques in engineering and science, especially in mobile communications and signal processing applications. The majority of the computational complexity incurred in LSE and MLR arises from a Hermitian matrix inversion. In practice, the Yule-Walker equations do not generally hold, and hence the Levinson-Durbin algorithm cannot be employed for general LSE and MLR problems; the most efficient Hermitian matrix inversion method is then based on the Cholesky factorization. In this paper, we derive a new dyadic recursion algorithm for sequential rank-adaptive Hermitian matrix inversions. In addition, we provide theoretical computational complexity analyses to compare our new dyadic recursion scheme with the conventional Cholesky factorization. A variable model-order LSE (MLR) can thereupon be designed using the proposed dyadic recursion approach. Through our complexity analyses and Monte Carlo simulations, we show that the new dyadic recursion algorithm is more efficient than the conventional Cholesky factorization for sequential rank-adaptive LSE (MLR), and that the associated variable model-order LSE (MLR) can trade off the targeted estimation performance against the required computational complexity. The proposed scheme can benefit future portable and mobile signal-processing and communications devices.
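For readers unfamiliar with the block recursion underlying such schemes, the following is a minimal illustrative sketch, not the paper's actual rank-adaptive algorithm: it inverts a Hermitian positive-definite matrix by dyadic 2x2 block partitioning using the standard Schur-complement identities, and shows where that inverse enters the LSE normal equations theta = (X^H X)^{-1} X^H y. The function name hermitian_inverse_dyadic and the NumPy setting are assumptions made for illustration only.

```python
import numpy as np

def hermitian_inverse_dyadic(R):
    """Invert a Hermitian positive-definite matrix by recursive 2x2 block
    partitioning (Schur-complement identities). Illustrative sketch only;
    the paper's rank-adaptive dyadic recursion may differ in detail."""
    n = R.shape[0]
    if n == 1:
        return np.array([[1.0 / R[0, 0]]], dtype=R.dtype)
    m = n // 2                       # dyadic split point
    A = R[:m, :m]                    # top-left block (Hermitian)
    B = R[:m, m:]                    # top-right block
    D = R[m:, m:]                    # bottom-right block (Hermitian)
    A_inv = hermitian_inverse_dyadic(A)
    # Schur complement of A in R; it is again Hermitian positive definite.
    S = D - B.conj().T @ A_inv @ B
    S_inv = hermitian_inverse_dyadic(S)
    # Assemble the block inverse from the standard 2x2 identities.
    top_left = A_inv + A_inv @ B @ S_inv @ B.conj().T @ A_inv
    top_right = -A_inv @ B @ S_inv
    return np.block([[top_left, top_right],
                     [top_right.conj().T, S_inv]])

# Example: least-square estimation theta = (X^H X)^{-1} X^H y,
# where R = X^H X is the Hermitian matrix whose inversion dominates the cost.
rng = np.random.default_rng(0)
X = rng.standard_normal((64, 8)) + 1j * rng.standard_normal((64, 8))
y = rng.standard_normal(64) + 1j * rng.standard_normal(64)
R = X.conj().T @ X
theta = hermitian_inverse_dyadic(R) @ (X.conj().T @ y)
assert np.allclose(theta, np.linalg.solve(R, X.conj().T @ y))
```

In a sequential rank-adaptive setting, the appeal of such a partitioned form is that the inverse of a lower-order (smaller) leading block can be reused when the model order grows, rather than refactorizing the full matrix from scratch; the paper quantifies this advantage against the Cholesky-based approach.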