Fast Kernel Generalized Discriminative Common Vectors for Feature Extraction

Katerine Diaz-Chito, Jesus Martinez del Rincon, Aura Hernandez-Sabate, Marçal Rusiñol, Francesc J. Ferri

Research output: Contribution to journal › Article


Abstract

This paper presents a supervised subspace learning method called Kernel Generalized Discriminative Common Vectors (KGDCV), a novel extension of the well-known Discriminative Common Vectors method with kernels. Our method combines the advantages of kernel methods, which model complex data and solve nonlinear problems with moderate computational complexity, with the better generalization properties of generalized approaches for high-dimensional data. This attractive combination makes KGDCV especially suited for feature extraction and classification in computer vision, image processing and pattern recognition applications. Two different approaches to this generalization are proposed: the first based on the Kernel Trick, and the second based on the Nonlinear Projection Trick (NPT) for even higher efficiency. Both methodologies have been validated on four different image datasets containing faces, objects and handwritten digits, and compared against well-known nonlinear state-of-the-art methods. Results show better discriminant properties than other generalized approaches, both linear and kernel-based. In addition, the KGDCV-NPT approach provides a considerable computational gain without compromising the accuracy of the model.
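As a rough illustration of why the NPT variant is more efficient, the sketch below shows the general Nonlinear Projection Trick: data are mapped once into a finite-dimensional empirical kernel feature space via an eigendecomposition of the centred kernel matrix, after which a linear subspace method (such as a generalized DCV) can be applied directly. This is not the authors' implementation; the RBF kernel, the `gamma` and `tol` values, and the function names `rbf_kernel` / `npt_embedding` are illustrative assumptions.

```python
import numpy as np

def rbf_kernel(A, B, gamma=1e-3):
    """Pairwise RBF kernel between the rows of A and the rows of B (assumed kernel choice)."""
    d2 = (A ** 2).sum(1)[:, None] + (B ** 2).sum(1)[None, :] - 2.0 * A @ B.T
    return np.exp(-gamma * d2)

def npt_embedding(X_train, X_test, gamma=1e-3, tol=1e-10):
    """Embed data into the empirical kernel feature space via the
    Nonlinear Projection Trick, so a linear subspace method
    (e.g. a generalized DCV) can then be applied in that space."""
    n = X_train.shape[0]
    K = rbf_kernel(X_train, X_train, gamma)
    # Centre the training kernel matrix in feature space.
    J = np.eye(n) - np.ones((n, n)) / n
    Kc = J @ K @ J
    # Eigendecomposition; keep only numerically positive eigenvalues.
    lam, U = np.linalg.eigh(Kc)
    keep = lam > tol
    lam, U = lam[keep], U[:, keep]
    # Training samples in the empirical feature space (one row per sample).
    Y_train = U * np.sqrt(lam)
    # Centre the test kernel vectors consistently, then project them.
    K_test = rbf_kernel(X_test, X_train, gamma)
    Kc_test = (K_test - np.ones((X_test.shape[0], n)) @ K / n) @ J
    Y_test = Kc_test @ U / np.sqrt(lam)
    return Y_train, Y_test
```

Once the embedding is computed, the subsequent discriminative step operates on ordinary finite-dimensional vectors, which is where the computational gain of the NPT-based approach reported in the abstract comes from.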
Original language: English
Pages (from-to): 512-524
Number of pages: 13
Journal: Journal of Mathematical Imaging and Vision
Volume: 60
Issue number: 4
Early online date: 24 Oct 2017
DOIs:
Publication status: Published - May 2018

