Jisuanji kexue yu tansuo (Jul 2023)

Least Squares Support Vector Machine for Minimizing VC Dimensional Expectation Upper Bound

  • LI Hao, WANG Shitong

DOI
https://doi.org/10.3778/j.issn.1673-9418.2112108
Journal volume & issue
Vol. 17, no. 7
pp. 1599–1608

Abstract

Machine learning is a key component of modern computing, and the support vector machine (SVM) has been widely adopted across many fields in recent years owing to its strong performance. The capacity of an SVM can be measured by its VC (Vapnik-Chervonenkis) dimension, an index of classifier complexity; in theory, a low VC dimension implies good generalization. However, for some classifiers based on the traditional SVM, the upper bound on the VC dimension may become infinite when handling diverse data. Although such methods achieve good results in practice, their generalization cannot be guaranteed, and performance degrades on certain data. This paper therefore proposes an improved LSSVM (least squares support vector machine) algorithm. Building on LSSVM, the method minimizes the upper bound of the VC dimension to find the optimal projection scheme, and then applies this projection within the LSSVM algorithm to classify the data. Experimental results show that, on benchmark datasets, the error rate of the proposed classifier is lower than that of the traditional least squares support vector machine: with an approximately equal number of support vectors, the proposed algorithm achieves higher test accuracy than the comparison algorithms and improves generalization ability.
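The abstract does not detail the paper's specific VC-bound objective or projection scheme. For orientation, the classical Vapnik result bounds the VC dimension of margin hyperplanes on data contained in a ball of radius R by h <= min(ceil(R^2 * ||w||^2), d) + 1, which is the kind of bound that bound-minimizing formulations typically exploit. Below is a minimal sketch of the standard LSSVM classifier that the proposed method builds on, i.e. Suykens-style least squares SVM solved as a linear KKT system; the RBF kernel choice, parameter names, and the train/predict helpers here are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def rbf_kernel(X1, X2, sigma=1.0):
    # Gaussian (RBF) kernel matrix between two sample sets
    d2 = np.sum(X1**2, 1)[:, None] + np.sum(X2**2, 1)[None, :] - 2 * X1 @ X2.T
    return np.exp(-d2 / (2 * sigma**2))

def lssvm_train(X, y, gamma=1.0, sigma=1.0):
    # Standard LSSVM dual: solve the (n+1)x(n+1) KKT linear system
    #   [ 0    y^T        ] [b]     [0]
    #   [ y    Omega+I/g  ] [a]  =  [1]
    # where Omega_ij = y_i y_j K(x_i, x_j) and y in {-1, +1}.
    n = len(y)
    Omega = (y[:, None] * y[None, :]) * rbf_kernel(X, X, sigma)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = y
    A[1:, 0] = y
    A[1:, 1:] = Omega + np.eye(n) / gamma
    rhs = np.concatenate(([0.0], np.ones(n)))
    sol = np.linalg.solve(A, rhs)
    return sol[0], sol[1:]  # bias b, multipliers alpha

def lssvm_predict(X, y, alpha, b, Xtest, sigma=1.0):
    # Decision function: sign(sum_i alpha_i y_i K(x, x_i) + b)
    K = rbf_kernel(Xtest, X, sigma)
    return np.sign(K @ (alpha * y) + b)

# Tiny usage example on synthetic data (assumption: two Gaussian blobs)
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(-1, 1, (20, 2)), rng.normal(1, 1, (20, 2))])
y = np.concatenate([-np.ones(20), np.ones(20)])
b, alpha = lssvm_train(X, y, gamma=10.0, sigma=1.0)
print(lssvm_predict(X, y, alpha, b, X, sigma=1.0))
```

Unlike the standard SVM's quadratic program, LSSVM replaces inequality constraints with equality constraints, so training reduces to one linear solve; the paper's contribution, per the abstract, is to prepend a projection chosen to minimize the VC-dimension upper bound before this step.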

Keywords