IEEE Access (Jan 2019)
Using Low-Rank Approximations to Speed Up Kernel Logistic Regression Algorithm
Abstract
Logistic regression, a classic classification algorithm, has the limitation that it can only be applied to linearly separable data. For data that are not linearly separable, a kernel trick maps them into a higher-dimensional space in which they become easier to separate. However, as data grow in scale, use of the kernel trick becomes increasingly restricted: once the data reach a certain scale, the cost of storing and computing the kernel matrix is prohibitive. To mitigate this kernel-matrix bottleneck, we employ a low-rank approximation of the kernel matrix to speed up the solution of kernel logistic regression (KLR) and propose a framework for solving the KLR algorithm quickly. We use a fast iterative algorithm, similar to the sequential minimal optimization (SMO) algorithm, to solve the dual problem of KLR. In addition, within this framework, the low-rank approximation is combined with gradient descent and with Newton's method, respectively. The low-rank approximation reduces redundant information in the data, which not only speeds up the solution of KLR but also improves classification accuracy. Finally, extensive experiments show that the KLR optimization algorithms based on our proposed framework outperform state-of-the-art algorithms.
Keywords
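To make the idea concrete, the following is a minimal sketch of the general approach the abstract describes: replace the full n x n kernel matrix with a rank-m factorization and train KLR in the resulting low-dimensional feature space. The specific choices here (an RBF kernel, Nystrom sampling as the low-rank method, and plain gradient descent as the solver) are illustrative assumptions, not the authors' exact framework or SMO-style dual solver.

```python
import numpy as np

def rbf_kernel(A, B, gamma=1.0):
    # Pairwise RBF kernel values between rows of A and rows of B.
    sq = (A**2).sum(1)[:, None] + (B**2).sum(1)[None, :] - 2 * A @ B.T
    return np.exp(-gamma * sq)

def nystrom_features(X, m=50, gamma=1.0, seed=0):
    # Rank-m Nystrom approximation K ~= C W^{-1} C^T, returned as explicit
    # features Z with Z Z^T ~= K, so the full n x n kernel is never stored.
    # (Nystrom is one common low-rank method; the paper may use another.)
    rng = np.random.default_rng(seed)
    idx = rng.choice(len(X), size=m, replace=False)
    C = rbf_kernel(X, X[idx], gamma)            # n x m cross-kernel block
    W = C[idx]                                   # m x m landmark block
    vals, vecs = np.linalg.eigh(W)
    vals = np.maximum(vals, 1e-10)               # guard against tiny eigenvalues
    return C @ vecs / np.sqrt(vals)              # n x m feature map

def fit_klr(Z, y, lr=0.1, reg=1e-4, iters=500):
    # Gradient descent on the regularized logistic loss in the low-rank
    # feature space: the weight vector lives in R^m instead of R^n.
    w = np.zeros(Z.shape[1])
    for _ in range(iters):
        p = 1.0 / (1.0 + np.exp(-Z @ w))         # predicted probabilities
        w -= lr * (Z.T @ (p - y) / len(y) + reg * w)
    return w

# Toy usage: two Gaussian blobs with labels in {0, 1}.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(-1, 1, (100, 2)), rng.normal(1, 1, (100, 2))])
y = np.r_[np.zeros(100), np.ones(100)]
Z = nystrom_features(X, m=20)
w = fit_klr(Z, y)
print("train accuracy:", ((Z @ w > 0) == y).mean())
```

The key cost saving is visible in the shapes: the kernel matrix is never materialized as n x n, and each training iteration touches only the n x m factor, which is what makes low-rank approximation attractive once n is large.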