Journal of Inequalities and Applications (Sep 2018)
Error analysis for $l^{q}$-coefficient regularized moving least-square regression
Abstract
We consider the moving least-square (MLS) method in the coefficient-based regression framework with an $l^{q}$-regularizer $(1\leq q\leq 2)$ and sample-dependent hypothesis spaces. The data-dependent character of the new algorithm provides flexibility and adaptivity for MLS. We carry out a rigorous error analysis by using the stepping-stone technique in the error decomposition. The concentration technique with the $l^{2}$-empirical covering number is also employed in our study to improve the sample error. We derive a satisfactory learning rate that can be arbitrarily close to the best rate $O(m^{-1})$ under more natural and much simpler conditions.
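To make the setting concrete, here is a minimal numerical sketch (not the paper's algorithm or its error analysis). It illustrates coefficient-based MLS in a sample-dependent hypothesis space $f(x)=\sum_i \alpha_i K(x, x_i)$ for the special case $q=2$, where the locally weighted, $l^{2}$-regularized coefficient problem has a closed-form solution. The Gaussian kernel, the window width `sigma`, and the regularization parameter `lam` are illustrative assumptions, not choices made in the paper.

```python
import numpy as np

def mls_coeff_ridge(X, y, x0, sigma=0.3, lam=1e-3):
    """Coefficient-based MLS prediction at a query point x0 (q = 2 sketch).

    X : 1-D array of sample inputs, y : 1-D array of sample outputs.
    Hypothesis space is sample-dependent: f = sum_i alpha_i * K(., X_i).
    """
    # Local window weights centered at the query point (MLS ingredient).
    w = np.exp(-((X - x0) ** 2) / (2 * sigma ** 2))
    # Gram-type design matrix K[i, j] = K(X_i, X_j) over the samples.
    K = np.exp(-((X[:, None] - X[None, :]) ** 2) / (2 * sigma ** 2))
    # For q = 2 the weighted empirical error plus lam * ||alpha||_2^2
    # is minimized by a weighted ridge system (closed form).
    W = np.diag(w)
    alpha = np.linalg.solve(K.T @ W @ K + lam * np.eye(len(X)), K.T @ W @ y)
    # Evaluate the coefficient expansion at the query point.
    k0 = np.exp(-((x0 - X) ** 2) / (2 * sigma ** 2))
    return k0 @ alpha
```

For $1\leq q<2$ the regularizer is no longer quadratic and the coefficients would instead be found by an iterative (e.g. proximal) solver; the closed form above is specific to $q=2$.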