Building Support Vector Machines in the Context of Regularised Least Squares

Research output: Contribution to journal › Article

6 Citations (Scopus)
181 Downloads (Pure)

Abstract

This paper formulates a linear kernel support vector machine (SVM) as a regularized least-squares (RLS) problem. By defining a set of indicator variables of the errors, the solution to the RLS problem is represented as an equation that relates the error vector to the indicator variables. Through partitioning of the training set, the SVM weights and bias are expressed analytically using the support vectors. It is also shown how this approach naturally extends to SVMs with nonlinear kernels whilst avoiding the need for Lagrange multipliers and duality theory. A fast iterative solution algorithm, based on Cholesky decomposition with permutation of the support vectors, is proposed as a solution method. The properties of our SVM formulation are analyzed and compared with those of standard SVMs using a simple example that can be illustrated graphically. The correctness and behavior of our solution (derived purely in the primal context of RLS) are demonstrated using a set of public benchmarking problems for both linear and nonlinear SVMs.
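The abstract describes the approach only at a high level. As a rough illustration of the general idea (not the paper's actual algorithm), the sketch below trains a linear SVM with a squared error loss by repeatedly solving a regularized least-squares subproblem on the current set of margin violators, factorizing the normal equations with a Cholesky decomposition. The function name, the indicator-set update, and the stopping rule are illustrative assumptions, not details taken from the paper.

```python
# Minimal sketch: linear SVM via iterative regularized least squares (assumptions noted above).
import numpy as np

def rls_svm(X, y, C=1.0, max_iter=50):
    """X: (n, d) features; y: (n,) labels in {-1, +1}. Illustrative only."""
    n, d = X.shape
    Xb = np.hstack([X, np.ones((n, 1))])       # append a column of ones for the bias term
    theta = np.zeros(d + 1)                    # theta = [w; b]
    active = np.ones(n, dtype=bool)            # indicator variables: start with all points marked as errors
    for _ in range(max_iter):
        A = Xb[active]
        yA = y[active]
        # RLS subproblem on the active set:
        #   min_theta 0.5 * theta' R theta + 0.5 * C * ||yA - A theta||^2,
        # where R regularizes the weights but not the bias.
        R = np.eye(d + 1)
        R[d, d] = 0.0
        H = R + C * A.T @ A                    # normal-equations matrix (positive definite while the active set is nonempty)
        rhs = C * A.T @ yA
        L = np.linalg.cholesky(H)              # Cholesky factorization H = L L'
        theta = np.linalg.solve(L.T, np.linalg.solve(L, rhs))
        new_active = y * (Xb @ theta) < 1.0    # points violating the margin under the new solution
        if np.array_equal(new_active, active) or not new_active.any():
            break                              # indicator set has stabilized
        active = new_active
    return theta[:d], theta[d]

# Usage on toy two-class data:
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(-1, 0.5, (20, 2)), rng.normal(+1, 0.5, (20, 2))])
y = np.hstack([-np.ones(20), np.ones(20)])
w, b = rls_svm(X, y, C=10.0)
print("weights:", w, "bias:", b)
```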
Original language: English
Pages (from-to): 1
Number of pages: 61
Journal: Neurocomputing
Early online date: 08 Jun 2016
Publication status: Early online date - 08 Jun 2016

