Improved conjugate gradient implementation for least squares support vector machines

B. Li, S.J. Song, Kang Li

Research output: Contribution to journal › Article › peer-review

13 Citations (Scopus)


As a promising method for pattern recognition and function estimation, least squares support vector machines (LS-SVM) express training as the solution of a linear system rather than the quadratic programming problem of conventional support vector machines (SVM). In this paper, by exploiting the information provided by the equality constraint, we transform the minimization problem with a single equality constraint in LS-SVM into an unconstrained minimization problem, and then propose reduced formulations for LS-SVM. With this transformation, the number of calls to the conjugate gradient (CG) method, the most time-consuming step in obtaining the numerical solution, is reduced from the two required by Suykens et al. (1999) to one. A comparison of the computational speed of our method against the CG method of Suykens et al. and the first-order and second-order SMO methods on several benchmark data sets shows a reduction in training time of up to 44%. © 2011 Elsevier B.V. All rights reserved.
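The baseline the abstract refers to, the two-solve CG scheme of Suykens et al. (1999) that the paper reduces to a single solve, can be sketched as follows. This is a minimal illustration, not the paper's code: the RBF kernel, the `gamma` and `sigma` values, and all function names are assumptions made for the example.

```python
import numpy as np

def conjugate_gradient(A, rhs, tol=1e-10, max_iter=1000):
    """Plain CG for a symmetric positive-definite system A x = rhs."""
    x = np.zeros_like(rhs)
    r = rhs - A @ x
    p = r.copy()
    rs = r @ r
    for _ in range(max_iter):
        Ap = A @ p
        step = rs / (p @ Ap)
        x += step * p
        r -= step * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs) * p
        rs = rs_new
    return x

def rbf_kernel(X1, X2, sigma):
    """Gaussian (RBF) kernel matrix between two point sets."""
    sq = np.sum((X1[:, None, :] - X2[None, :, :]) ** 2, axis=-1)
    return np.exp(-sq / (2.0 * sigma ** 2))

def lssvm_train_two_cg(X, y, gamma, sigma):
    """Classical LS-SVM classifier training (Suykens et al., 1999):
    two CG solves on the positive-definite block H = Omega + I/gamma.
    The paper's reduced formulation needs only one such solve."""
    n = len(y)
    Omega = np.outer(y, y) * rbf_kernel(X, X, sigma)
    H = Omega + np.eye(n) / gamma
    eta = conjugate_gradient(H, y)           # first CG solve:  H eta = y
    nu = conjugate_gradient(H, np.ones(n))   # second CG solve: H nu  = 1
    b = (y @ nu) / (y @ eta)                 # bias from the equality constraint
    alpha = nu - b * eta                     # support values
    return alpha, b

def lssvm_decision(X_train, y, alpha, b, X_test, sigma):
    """Decision values f(x) = sum_i alpha_i y_i K(x, x_i) + b."""
    return rbf_kernel(X_test, X_train, sigma) @ (alpha * y) + b
```

For n training points this baseline costs two CG runs on an n-by-n system; eliminating one of them is where savings of up to 44% in training time become plausible.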
Original language: English
Pages (from-to): 121-125
Number of pages: 5
Journal: Pattern Recognition Letters
Issue number: 2
Publication status: Published - 15 Jan 2012

ASJC Scopus subject areas

  • Software
  • Artificial Intelligence
  • Computer Vision and Pattern Recognition
  • Signal Processing


