Lattice Machine Classification based on Contextual Probability

Hui Wang, Ivo Düntsch, Luis Trindade

Research output: Contribution to journal › Article › peer-review

3 Citations (Scopus)


In this paper we review the Lattice Machine, a learning paradigm that "learns" by generalising data in a consistent, conservative and parsimonious way, and which has the advantage of providing additional reliability information for any classification. More specifically, we review related concepts such as hyper tuples and hyper relations, the three generalising criteria (equilabelledness, maximality, and supportedness), and the modelling and classifying algorithms. In an attempt to find a better classification method for the Lattice Machine, we consider contextual probability, which was originally proposed as a measure for approximate reasoning when data are insufficient. It was later shown to be a probability function with the same classification ability as the data-generating probability, called the primary probability, and to offer an alternative way of estimating the primary probability with few model assumptions. Consequently, a contextual-probability-based Bayes classifier can be designed. In this paper we present a new classifier that utilises the Lattice Machine model and generalises the contextual-probability-based Bayes classifier. We interpret the model as a dense set of data points in the data space and then apply the contextual-probability-based Bayes classifier. A theorem is presented that allows efficient estimation of the contextual probability under this interpretation. The proposed classifier is illustrated by examples.
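The abstract's key idea, estimating class probabilities by counting neighbourhoods that cover a query point rather than fitting a parametric density, can be illustrated with a toy sketch. This is not the paper's algorithm: the function name, the fixed-radius hyper-rectangle neighbourhoods, and the unnormalised scores are all simplifying assumptions for illustration only; the paper's contextual probability and Lattice Machine hyper tuples are defined differently.

```python
import numpy as np

def neighbourhood_counting_classify(X, y, query, radius=1.0):
    """Toy neighbourhood-counting classifier (illustrative only).

    For each class, count the axis-aligned hyper-rectangles of
    half-width `radius`, centred on that class's training points,
    that contain the query; predict the class with the largest
    count. The counts act as crude, unnormalised class scores.
    """
    X = np.asarray(X, dtype=float)
    y = np.asarray(y)
    query = np.asarray(query, dtype=float)

    scores = {}
    for label in np.unique(y):
        pts = X[y == label]
        # A point's hyper-rectangle contains the query iff the query
        # is within `radius` of the point in every coordinate.
        inside = np.all(np.abs(pts - query) <= radius, axis=1)
        scores[label] = int(inside.sum())
    return max(scores, key=scores.get), scores
```

For example, with two class-'a' points near the origin and one class-'b' point far away, a query near the origin falls inside both 'a' neighbourhoods and neither 'b' neighbourhood, so 'a' wins.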
Original language: English
Pages (from-to): 241-256
Number of pages: 16
Journal: Fundamenta Informaticae
Issue number: 1-4
Publication status: Published - 2013
Externally published: Yes
