Fair Outlier Detection

Deepak P., Savitha Sam Abraham

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

9 Citations (Scopus)
151 Downloads (Pure)


An outlier detection method may be considered fair over specified sensitive attributes if the results of outlier detection are not skewed towards particular groups defined on such sensitive attributes. In this paper, we consider, for the first time to the best of our knowledge, the task of fair outlier detection. Our focus is on fair outlier detection over multiple multi-valued sensitive attributes (e.g., gender, race, religion, nationality, marital status), a setting with broad applications across web data scenarios. We propose a fair outlier detection method, FairLOF, inspired by the popular LOF formulation for neighborhood-based outlier detection. We outline ways in which unfairness could be induced within LOF and develop three heuristic principles to enhance fairness, which form the basis of the FairLOF method. Since this is a novel task, we develop an evaluation framework for fair outlier detection and use it to benchmark FairLOF on both the quality and the fairness of its results. Through an extensive empirical evaluation over real-world datasets, we illustrate that FairLOF achieves significant improvements in fairness, with only marginal degradations in result quality relative to the fairness-agnostic LOF method.
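To make the task concrete, here is a minimal sketch of fairness-agnostic LOF outlier detection together with the kind of group-wise skew check the abstract describes. This is not the paper's FairLOF method; the data, group names, and thresholds below are hypothetical, and scikit-learn's standard LOF implementation stands in for the base detector.

```python
# Sketch: standard (fairness-agnostic) LOF outlier detection, plus a simple
# check of how outlier flags distribute across one sensitive attribute.
# FairLOF itself is NOT implemented here; data and groups are synthetic.
import numpy as np
from sklearn.neighbors import LocalOutlierFactor

rng = np.random.default_rng(0)
# Hypothetical data: two groups ("A", "B") on a single sensitive attribute,
# drawn from slightly different distributions.
X = np.vstack([rng.normal(0.0, 1.0, (100, 2)),
               rng.normal(0.5, 2.0, (100, 2))])
groups = np.array(["A"] * 100 + ["B"] * 100)

# Plain LOF: fit_predict returns -1 for predicted outliers, 1 for inliers.
lof = LocalOutlierFactor(n_neighbors=20, contamination=0.1)
labels = lof.fit_predict(X)

# Fairness check: per-group outlier rates. A large gap between groups is the
# kind of skew a fair outlier detector aims to reduce.
for g in ("A", "B"):
    rate = float(np.mean(labels[groups == g] == -1))
    print(f"group {g}: outlier rate = {rate:.2f}")
```

With `contamination=0.1`, roughly 10% of points are flagged overall; the point of the group-wise breakdown is that those flags need not be spread evenly across groups.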
Original language: English
Title of host publication: WISE: International Conference on Web Information Systems Engineering 2020
Publication status: Published - 21 Oct 2020
Event: 21st International Conference on Web Information Systems Engineering: WISE 2020
Duration: 20 Oct 2020 – 24 Oct 2020

Publication series

Name: Lecture Notes in Computer Science
ISSN (Print): 0302-9743


Conference: 21st International Conference on Web Information Systems Engineering


