
Least Squares Support Vector Machine for Minimizing VC Dimensional Expectation Upper Bound.

Authors :
LI Hao
WANG Shitong
Source :
Journal of Frontiers of Computer Science & Technology; Jul2023, Vol. 17 Issue 7, p1599-1608, 10p
Publication Year :
2023

Abstract

Machine learning is an important area of modern computing, and the support vector machine (SVM) has been widely applied across many fields in recent years because of its good performance. The performance of an SVM can be measured by its VC (Vapnik-Chervonenkis) dimension, an index of the complexity of the learning machine; in theory, a low VC dimension yields good generalization. However, for some classifiers based on the traditional SVM, the upper bound of the VC dimension may be infinite when handling certain kinds of data. Although such classifiers have achieved good results in practice, good generalization cannot be guaranteed, and results on some special data are poor. Therefore, this paper proposes an improved LSSVM (least squares support vector machine) algorithm. Building on the LSSVM, the proposed method minimizes an upper bound of the VC dimension to find the desired optimal projection scheme, and then incorporates that projection into the LSSVM to classify the data. Experimental results show that the error rate of the proposed classifier is lower than that of the traditional least squares support vector machine on benchmark datasets; that is, with an approximately equal number of support vectors, the proposed algorithm achieves better test accuracy than the comparison algorithms and improves generalization ability. [ABSTRACT FROM AUTHOR]
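For context on the base algorithm the paper builds on: a standard LSSVM replaces the SVM's inequality constraints with equality constraints, so training reduces to solving one linear system in the dual variables rather than a quadratic program. The sketch below shows only this baseline LSSVM (the paper's VC-bound-minimizing projection step is not detailed in the abstract and is not reproduced here); the RBF kernel, the function names, and the parameter values `gamma` and `sigma` are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def rbf_kernel(A, B, sigma=1.0):
    """Gaussian (RBF) kernel Gram matrix between row-vector sets A and B."""
    d2 = (np.sum(A**2, axis=1)[:, None]
          + np.sum(B**2, axis=1)[None, :]
          - 2.0 * A @ B.T)
    return np.exp(-d2 / (2.0 * sigma**2))

def lssvm_train(X, y, gamma=10.0, sigma=1.0):
    """Train a baseline LSSVM classifier.

    Solves the (n+1) x (n+1) linear system of the LSSVM dual:
        [ 0   1^T           ] [ b     ]   [ 0 ]
        [ 1   K + I / gamma ] [ alpha ] = [ y ]
    where K is the kernel Gram matrix and gamma the regularization weight.
    """
    n = len(y)
    K = rbf_kernel(X, X, sigma)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = 1.0
    A[1:, 0] = 1.0
    A[1:, 1:] = K + np.eye(n) / gamma
    rhs = np.concatenate(([0.0], y.astype(float)))
    sol = np.linalg.solve(A, rhs)
    return sol[0], sol[1:]          # bias b, dual coefficients alpha

def lssvm_predict(X_train, b, alpha, X_new, sigma=1.0):
    """Predict labels in {-1, +1} for new points."""
    return np.sign(rbf_kernel(X_new, X_train, sigma) @ alpha + b)
```

Note that, unlike a classical SVM, nearly all `alpha` coefficients are nonzero here (no sparsity), which is why the abstract compares methods at an approximately equal number of support vectors.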

Details

Language :
Chinese
ISSN :
1673-9418
Volume :
17
Issue :
7
Database :
Complementary Index
Journal :
Journal of Frontiers of Computer Science & Technology
Publication Type :
Academic Journal
Accession number :
165054252
Full Text :
https://doi.org/10.3778/j.issn.1673-9418.2112108