Gender Classification via Lips: Static and Dynamic Features

Darryl Stewart, Adrian Pass, Jianguo Zhang

Research output: Contribution to journal › Article

7 Citations (Scopus)
417 Downloads (Pure)

Abstract

Automatic gender classification has many security and commercial applications. Various modalities have been investigated for gender classification, with face-based classification being the most popular. In some real-world scenarios, however, the face may be partially occluded, and classification must instead be based on individual parts of the face, known as local features. We investigate gender classification using lip movements. We show for the first time that important gender-specific information can be obtained from the way in which a person moves their lips during speech. Furthermore, our study indicates that lip dynamics during speech provide greater gender-discriminative information than lip appearance alone. We also show that lip dynamics and appearance contain complementary gender information, such that a model which captures both traits gives the highest overall classification result. We use Discrete Cosine Transform based features and Gaussian Mixture Modelling to model lip appearance and dynamics, and employ the XM2VTS database for our experiments. Our experiments show that a model which captures lip dynamics along with appearance can improve gender classification rates by 16-21% compared with models of lip appearance only.
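The abstract describes the pipeline only at a high level. As a hedged illustration (not the authors' code), the sketch below pairs low-frequency 2D-DCT coefficients of a lip region with first-order delta features for dynamics, and classifies by comparing likelihoods under per-class diagonal-covariance Gaussian mixture models trained with a few EM steps. All function names, the 6x6 coefficient block, the number of mixture components, and the synthetic data are assumptions chosen for illustration.

```python
# Illustrative sketch: DCT lip features + deltas scored by per-class GMMs.
# Everything here is a simplified stand-in for the paper's actual setup.
import numpy as np

def dct_matrix(n):
    # Orthonormal DCT-II basis matrix (n x n).
    k = np.arange(n)
    m = np.cos(np.pi * (2 * k[None, :] + 1) * k[:, None] / (2 * n))
    m[0] *= np.sqrt(1 / n)
    m[1:] *= np.sqrt(2 / n)
    return m

def dct_features(frame, keep=6):
    # 2D DCT of a lip ROI; keep only the low-frequency top-left block.
    h, w = frame.shape
    coeffs = dct_matrix(h) @ frame @ dct_matrix(w).T
    return coeffs[:keep, :keep].ravel()

def with_deltas(static):
    # Append first-order frame-to-frame differences as dynamic features.
    deltas = np.diff(static, axis=0, prepend=static[:1])
    return np.hstack([static, deltas])

class DiagGMM:
    # Minimal diagonal-covariance GMM trained with a few EM iterations.
    def __init__(self, n_comp=4, n_iter=20, seed=0):
        self.n_comp, self.n_iter = n_comp, n_iter
        self.rng = np.random.default_rng(seed)

    def fit(self, X):
        n, _ = X.shape
        self.mu = X[self.rng.choice(n, self.n_comp, replace=False)]
        self.var = np.tile(X.var(0) + 1e-6, (self.n_comp, 1))
        self.w = np.full(self.n_comp, 1 / self.n_comp)
        for _ in range(self.n_iter):
            r = self._resp(X)                       # E-step: responsibilities
            nk = r.sum(0) + 1e-9
            self.w = nk / n                         # M-step: weights,
            self.mu = (r.T @ X) / nk[:, None]       # means,
            self.var = (r.T @ X**2) / nk[:, None] - self.mu**2 + 1e-6  # variances
        return self

    def _log_comp(self, X):
        # Per-component weighted log density, shape (n_frames, n_comp).
        return (-0.5 * (((X[:, None, :] - self.mu)**2 / self.var)
                        + np.log(2 * np.pi * self.var)).sum(-1)
                + np.log(self.w))

    def _resp(self, X):
        lp = self._log_comp(X)
        lp -= lp.max(1, keepdims=True)              # stabilise the softmax
        p = np.exp(lp)
        return p / p.sum(1, keepdims=True)

    def score(self, X):
        # Average per-frame log-likelihood of a sequence under this model.
        lp = self._log_comp(X)
        m = lp.max(1, keepdims=True)
        return float((m + np.log(np.exp(lp - m).sum(1, keepdims=True))).mean())

# Toy demo on synthetic "lip ROI" sequences: class B's frames brighten
# over time, so its dynamics differ from class A's.
rng = np.random.default_rng(1)
def synth_seq(ramp):
    frames = rng.random((30, 16, 16)) + ramp * np.linspace(0, 1, 30)[:, None, None]
    return with_deltas(np.array([dct_features(f) for f in frames]))

train_a = np.vstack([synth_seq(0.0) for _ in range(5)])
train_b = np.vstack([synth_seq(2.0) for _ in range(5)])
gmm_a, gmm_b = DiagGMM().fit(train_a), DiagGMM().fit(train_b)
probe = synth_seq(2.0)
predicted = "B" if gmm_b.score(probe) > gmm_a.score(probe) else "A"
```

Classification is by maximum average log-likelihood over the two class models, mirroring the standard GMM-likelihood-ratio decision rule; the paper's reported 16-21% gain comes from concatenating the static and delta streams, as in `with_deltas` above.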
Original language: English
Pages (from-to): 28-34
Journal: IET Biometrics
Volume: 2
Issue number: 1
DOI: 10.1049/iet-bmt.2012.0021
Publication status: Published - Mar 2013

Cite this

Stewart, Darryl ; Pass, Adrian ; Zhang, Jianguo. / Gender Classification via Lips: Static and Dynamic Features. In: IET Biometrics. 2013 ; Vol. 2, No. 1. pp. 28-34.
@article{ffdaabeff6e545a6a77c03f9c7ff6830,
title = "Gender Classification via Lips: Static and Dynamic Features",
abstract = "Automatic gender classification has many security and commercial applications. Various modalities have been investigated for gender classification with face-based classification being the most popular. In some real-world scenarios the face may be partially occluded. In these circumstances a classification based on individual parts of the face known as local features must be adopted. We investigate gender classification using lip movements. We show for the first time that important gender specific information can be obtained from the way in which a person moves their lips during speech. Furthermore our study indicates that the lip dynamics during speech provide greater gender discriminative information than simply lip appearance. We also show that the lip dynamics and appearance contain complementary gender information such that a model which captures both traits gives the highest overall classification result. We use Discrete Cosine Transform based features and Gaussian Mixture Modelling to model lip appearance and dynamics and employ the XM2VTS database for our experiments. Our experiments show that a model which captures lip dynamics along with appearance can improve gender classification rates by between 16-21{\%} compared to models of only lip appearance.",
author = "Darryl Stewart and Adrian Pass and Jianguo Zhang",
year = "2013",
month = mar,
doi = "10.1049/iet-bmt.2012.0021",
language = "English",
volume = "2",
pages = "28--34",
journal = "IET Biometrics",
issn = "2047-4938",
publisher = "The Institution of Engineering and Technology",
number = "1",
}


TY - JOUR
T1 - Gender Classification via Lips: Static and Dynamic Features
AU - Stewart, Darryl
AU - Pass, Adrian
AU - Zhang, Jianguo
PY - 2013/3
Y1 - 2013/3
AB - Automatic gender classification has many security and commercial applications. Various modalities have been investigated for gender classification with face-based classification being the most popular. In some real-world scenarios the face may be partially occluded. In these circumstances a classification based on individual parts of the face known as local features must be adopted. We investigate gender classification using lip movements. We show for the first time that important gender specific information can be obtained from the way in which a person moves their lips during speech. Furthermore our study indicates that the lip dynamics during speech provide greater gender discriminative information than simply lip appearance. We also show that the lip dynamics and appearance contain complementary gender information such that a model which captures both traits gives the highest overall classification result. We use Discrete Cosine Transform based features and Gaussian Mixture Modelling to model lip appearance and dynamics and employ the XM2VTS database for our experiments. Our experiments show that a model which captures lip dynamics along with appearance can improve gender classification rates by between 16-21% compared to models of only lip appearance.
DO - 10.1049/iet-bmt.2012.0021
M3 - Article
VL - 2
SP - 28
EP - 34
JO - IET Biometrics
JF - IET Biometrics
SN - 2047-4938
IS - 1
ER -