Training Sparse Least Squares Support Vector Machines by the QR Decomposition

Xiao-Lei Xia

Research output: Contribution to journal › Article

4 Citations (Scopus)
115 Downloads (Pure)

Abstract

The solution of the least squares support vector machine (LS-SVM) suffers from non-sparseness. This paper proposes applying the kernel matching pursuit (KMP) algorithm, with the number of support vectors as the regularization parameter, to tackle the non-sparseness problem of LS-SVMs. The KMP algorithm is first revisited from the perspective of the QR decomposition of the kernel matrix on the training set. Strategies are then developed to select the support vectors that minimize the leave-one-out cross-validation (LOOCV) error of the resultant sparse LS-SVM model, and it is demonstrated that this LOOCV error can be computed accurately and efficiently. Experimental results on benchmark datasets show that, compared to the standard SVM and variant sparse LS-SVM models, the proposed sparse LS-SVM models built on the KMP algorithm achieve comparable performance in terms of both accuracy and sparsity.
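
The greedy construction underlying KMP can be illustrated with a short sketch: candidate support vectors correspond to columns of the kernel matrix, each step adds the column that best explains the current residual, and the chosen columns are orthogonalised incrementally in the spirit of a QR factorisation. This is a minimal illustration under stated assumptions, not the paper's reference implementation: it uses the standard residual-correlation selection rule of KMP rather than the LOOCV-based selection strategy developed in the paper, and the kernel, function, and parameter names (`rbf_kernel`, `kmp_sparse_ls`, `n_support`, `gamma`) are illustrative.

```python
import numpy as np

def rbf_kernel(X, Z, gamma=0.5):
    # Gaussian (RBF) kernel matrix between the rows of X and Z (illustrative choice).
    sq = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-gamma * sq)

def kmp_sparse_ls(X, y, n_support=10, gamma=0.5):
    """Greedy kernel matching pursuit for a sparse least-squares model.

    Each iteration adds the kernel column most correlated with the current
    residual, after projecting out the columns already selected; the chosen
    columns are orthogonalised incrementally, mirroring a QR factorisation.
    """
    K = rbf_kernel(X, X, gamma)           # kernel matrix on the training set
    y = y.astype(float)
    selected = []                         # indices of chosen support vectors
    Q = np.zeros((len(y), 0))             # orthonormal basis of chosen columns
    residual = y.copy()

    for _ in range(n_support):
        # Components of all candidate columns orthogonal to the chosen span.
        P = K - Q @ (Q.T @ K)
        norms = np.linalg.norm(P, axis=0)
        scores = np.abs(P.T @ residual) / np.maximum(norms, 1e-12)
        scores[selected] = -np.inf        # never pick the same column twice
        j = int(np.argmax(scores))
        selected.append(j)

        # Extend the orthonormal basis (incremental QR update) and refit.
        q = P[:, j] / max(norms[j], 1e-12)
        Q = np.hstack([Q, q[:, None]])
        residual = y - Q @ (Q.T @ y)      # least-squares residual on the span

    # Express the fit as weights on the selected kernel columns.
    beta, *_ = np.linalg.lstsq(K[:, selected], y, rcond=None)
    return selected, beta

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 2))
    y = np.sign(X[:, 0] * X[:, 1])        # simple nonlinear toy target
    sv, beta = kmp_sparse_ls(X, y, n_support=15)
    y_hat = rbf_kernel(X, X[sv]) @ beta   # predictions use only the selected support vectors
    print("training accuracy:", float(np.mean(np.sign(y_hat) == y)))
```

Fixing the number of support vectors in advance controls the sparsity of the resulting model directly, which is the role of the regularization parameter described in the abstract.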
Original language: English
Journal: Neural Networks
Early online date: 19 Jul 2018
DOIs
Publication status: Early online date - 19 Jul 2018

