IEEE Access (Jan 2018)

Convergence Rate for $l^{q}$-Coefficient Regularized Regression With Non-i.i.d. Sampling

  • Qin Guo,
  • Peixin Ye,
  • Binlei Cai

DOI
https://doi.org/10.1109/ACCESS.2018.2817215
Journal volume & issue
Vol. 6
pp. 18804 – 18813

Abstract

Many learning algorithms use hypothesis spaces that are trained from the samples themselves, but little theoretical work has been devoted to the study of such algorithms. In this paper, we present a mathematical analysis of kernel-based coefficient least squares regression with an $l^{q}$-regularizer, 1 ≤ q ≤ 2, which is essentially different from the analysis of algorithms whose hypothesis spaces are independent of the sample or depend only on the sample size. The error analysis is carried out under the assumption that the samples are drawn from a non-identical sequence of probability measures and satisfy a β-mixing condition. We use drift error analysis and the independent-blocks technique to handle the non-identical and the dependent settings, respectively. When the sequence of marginal distributions converges exponentially fast in the dual of a Hölder space and the sampling process satisfies a polynomial β-mixing condition, we obtain capacity-dependent error bounds for the algorithm. As a byproduct, we derive a significantly faster learning rate that can be arbitrarily close to the best rate $O(m^{-1})$ for independent and identically distributed samples.
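
For orientation, the following is a minimal sketch of the $l^{q}$-coefficient regularized least squares scheme in the standard form used in the coefficient-regularization literature; the notation (sample $\mathbf{z}$, kernel $K$, regularization parameter $\lambda$) is assumed here rather than quoted from the paper. Given a sample $\mathbf{z}=\{(x_{i},y_{i})\}_{i=1}^{m}$ and a Mercer kernel $K$, the estimator is $f_{\mathbf{z}}=\sum_{j=1}^{m}\alpha_{j}^{\mathbf{z}}K(x_{j},\cdot)$ with coefficients

\begin{equation*}
  \boldsymbol{\alpha}^{\mathbf{z}}
  = \operatorname*{arg\,min}_{\boldsymbol{\alpha}\in\mathbb{R}^{m}}
    \frac{1}{m}\sum_{i=1}^{m}
      \Bigl(\sum_{j=1}^{m}\alpha_{j}K(x_{i},x_{j}) - y_{i}\Bigr)^{2}
    + \lambda \sum_{j=1}^{m}\lvert\alpha_{j}\rvert^{q},
  \qquad 1 \le q \le 2 .
\end{equation*}

The hypothesis space $\{\sum_{j=1}^{m}\alpha_{j}K(x_{j},\cdot) : \boldsymbol{\alpha}\in\mathbb{R}^{m}\}$ is built from the sample points themselves, which is the sense in which it is "trained from samples" and why the analysis differs from the classical sample-independent setting.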

Keywords