Electronic Research Archive (Jan 2023)
Learning capability of the rescaled pure greedy algorithm with non-iid sampling
Abstract
We consider the rescaled pure greedy learning algorithm (RPGLA) with dependent samples drawn from a non-identical sequence of probability distributions. The generalization performance is analyzed by applying the independent-blocks technique and accounting for the drift error. We derive a satisfactory learning rate for the algorithm under the assumption that the sampling process is stationary and $\beta$-mixing, and we show that the optimal rate $O(n^{-1})$ can be attained for i.i.d. processes.
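For orientation, a minimal sketch of one rescaled pure greedy iteration in a Hilbert space $H$ with a normalized dictionary $\mathcal{D}$ is given below; the notation ($r_m$, $\varphi_m$, $\lambda_m$, $s_m$) is illustrative, and the learning algorithm analyzed in the paper works with empirical, data-dependent versions of these quantities rather than the target $f$ itself.

Starting from $f_0 = 0$, for $m = 1, 2, \dots$ with residual $r_{m-1} = f - f_{m-1}$ (assuming $\|\varphi\| = 1$ for all $\varphi \in \mathcal{D}$),
\begin{align*}
  \varphi_m &\in \operatorname*{arg\,max}_{\varphi \in \mathcal{D}} \bigl|\langle r_{m-1}, \varphi\rangle\bigr|, &
  \lambda_m &= \langle r_{m-1}, \varphi_m\rangle, \\
  \hat{f}_m &= f_{m-1} + \lambda_m \varphi_m, &
  f_m &= s_m \hat{f}_m, \qquad
  s_m = \frac{\langle f, \hat{f}_m\rangle}{\|\hat{f}_m\|^{2}},
\end{align*}
so that each pure greedy step is followed by a rescaling of the whole approximant along its own direction.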
Keywords