IEEE Access (Jan 2018)

Error Analysis of Least-Squares $l^q$-Regularized Regression Learning Algorithm With the Non-Identical and Dependent Samples

  • Qin Guo,
  • Peixin Ye

DOI
https://doi.org/10.1109/ACCESS.2018.2863600
Journal volume & issue
Vol. 6
pp. 43824 – 43829

Abstract

The selection of the penalty functional is critical to the performance of a regularized learning algorithm, so the $l^q$-regularizer (1 ≤ q ≤ 2) deserves special attention. We consider the regularized least-squares regression learning algorithm with non-identical and weakly dependent samples. The dependent samples satisfy a polynomially β-mixing condition, and the sequence of non-identical sampling marginal measures converges exponentially to a probability measure in the dual of a Hölder space. We conduct a rigorous unified error analysis and derive satisfactory learning rates for the algorithm by means of the stepping-stone technique in the error decomposition and the independent-blocks technique in the sample error estimates.
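To make the object of study concrete, the following is a minimal, hypothetical sketch of $l^q$-regularized least-squares regression in the simplest one-dimensional linear case: minimize the empirical squared error plus λ|w|^q for some 1 ≤ q ≤ 2. The function name, the penalty parameter `lam`, the exponent `q`, and the grid-search solver are all illustrative choices, not the estimator or the analysis of the paper (which works in a general kernel/function-space setting with dependent, non-identical samples).

```python
# Hypothetical one-dimensional illustration of l^q-regularized least squares.
# Objective: (1/n) * sum_i (y_i - w*x_i)^2 + lam * |w|**q, with 1 <= q <= 2.
# A brute-force grid search stands in for a proper optimizer, since the
# l^q penalty is non-smooth at 0 when q < 2.

def lq_least_squares_1d(xs, ys, lam=0.1, q=1.5, grid=None):
    """Return the grid point w minimizing the l^q-regularized empirical risk."""
    if grid is None:
        # Candidate slopes w in [-5, 5] with step 0.001.
        grid = [i / 1000.0 for i in range(-5000, 5001)]
    n = len(xs)

    def objective(w):
        empirical = sum((y - w * x) ** 2 for x, y in zip(xs, ys)) / n
        return empirical + lam * abs(w) ** q

    return min(grid, key=objective)


# Noise-free data generated by y = 2x: the l^q penalty shrinks the
# estimated slope slightly below the unregularized value 2.
xs = [0.0, 1.0, 2.0, 3.0]
ys = [0.0, 2.0, 4.0, 6.0]
w_hat = lq_least_squares_1d(xs, ys, lam=0.1, q=1.5)
```

The exponent q interpolates between the sparsity-inducing lasso penalty (q = 1) and the smooth ridge penalty (q = 2), which is why the abstract singles out the range 1 ≤ q ≤ 2.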

Keywords