A new sub-class linear discriminant for miniature spectrometer based food analysis

Omar Nibouche*, Fayas Asharindavida, Hui Wang, Jordan Vincent, Jun Liu, Saskia van Ruth, Paul Maguire, Enayet Rahman

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review


Abstract

The well-known and extensively studied Linear Discriminant Analysis (LDA) can suffer degraded performance when data are not homoscedastic or not Gaussian. That is, when the classical assumptions under which LDA models are built do not hold, LDA projections cannot extract the features needed to explain the intrinsic structure of the data and to separate the classes. Like many real-world data sets, data obtained using miniature spectrometers can suffer from these drawbacks, which limits the deployment of such technology for food analysis. The solution presented in this paper is to divide classes into sub-classes and to use the means of sub-classes, classes, and the whole data set in the proposed between-class scatter metric. Further, samples belonging to the same sub-class are used to build a measure of within-sub-class scatter. This solution addresses the shortcomings of classical LDA. Results obtained using the proposed solution on food data and on general machine learning data sets show that the work in this paper compares well with, and is very competitive against, similar sub-class LDA algorithms in the literature. An extension to a Hilbert space is also presented, and the kernel version of the presented solution can be fused with its linear counterpart to yield improved classification rates.
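The abstract outlines the general recipe: split each class into sub-classes, build a between-class scatter from sub-class, class, and global means, and a within-scatter from samples sharing a sub-class. The sketch below illustrates that family of sub-class LDA methods under stated assumptions, not the paper's specific metric: it splits classes with a simple k-means and uses textbook scatter matrices; `subclass_lda` and `_kmeans` are hypothetical helper names.

```python
import numpy as np

def _kmeans(X, k, n_iter=50, seed=0):
    """Tiny k-means used only to split a class into sub-classes."""
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), size=k, replace=False)]
    labels = np.zeros(len(X), dtype=int)
    for _ in range(n_iter):
        labels = np.argmin(((X[:, None, :] - centers[None, :, :]) ** 2).sum(-1), axis=1)
        for j in range(k):
            if np.any(labels == j):
                centers[j] = X[labels == j].mean(axis=0)
    return labels

def subclass_lda(X, y, n_subclasses=2, n_components=1):
    """Project X onto directions that separate sub-class means.

    The scatter definitions here are illustrative stand-ins for the
    paper's metric: Sb accumulates outer products of
    (sub-class mean - global mean), and Sw accumulates the scatter of
    samples around their own sub-class mean.
    """
    X = np.asarray(X, dtype=float)
    y = np.asarray(y)
    d = X.shape[1]
    mu = X.mean(axis=0)                      # global data mean
    Sb = np.zeros((d, d))                    # between sub-class scatter
    Sw = np.zeros((d, d))                    # within sub-class scatter
    for c in np.unique(y):
        Xc = X[y == c]
        k = min(n_subclasses, len(Xc))
        labels = _kmeans(Xc, k)              # split the class into sub-classes
        for s in range(k):
            Xs = Xc[labels == s]
            mu_s = Xs.mean(axis=0)
            diff = (mu_s - mu)[:, None]
            Sb += len(Xs) * (diff @ diff.T)
            Zs = Xs - mu_s
            Sw += Zs.T @ Zs
    # Generalised eigenproblem Sb w = lambda Sw w, with a small ridge on Sw.
    evals, evecs = np.linalg.eig(np.linalg.solve(Sw + 1e-6 * np.eye(d), Sb))
    order = np.argsort(-evals.real)
    W = evecs[:, order[:n_components]].real
    return X @ W, W
```

Because the between- and within-scatter are built per sub-class rather than per class, the projection can remain discriminative on multimodal (non-Gaussian, heteroscedastic) class distributions where a single class mean is a poor summary.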
Original language: English
Article number: 105136
Journal: Chemometrics and Intelligent Laboratory Systems
Volume: 250
Early online date: 19 May 2024
DOIs
Publication status: Published - Jul 2024

