Gender Classification via Lips: Static and Dynamic Features

Darryl Stewart, Adrian Pass, Jianguo Zhang

Research output: Contribution to journal › Article


Abstract

Automatic gender classification has many security and commercial applications. Various modalities have been investigated for gender classification, with face-based classification being the most popular. In some real-world scenarios, however, the face may be partially occluded. In these circumstances a classification based on individual parts of the face, known as local features, must be adopted. We investigate gender classification using lip movements. We show for the first time that important gender-specific information can be obtained from the way in which a person moves their lips during speech. Furthermore, our study indicates that the lip dynamics during speech provide greater gender-discriminative information than lip appearance alone. We also show that lip dynamics and appearance contain complementary gender information, such that a model which captures both traits gives the highest overall classification result. We use Discrete Cosine Transform (DCT) based features and Gaussian Mixture Modelling to model lip appearance and dynamics, and employ the XM2VTS database for our experiments. Our experiments show that a model which captures lip dynamics along with appearance can improve gender classification rates by between 16% and 21% compared with models of lip appearance only.
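The pipeline the abstract describes (DCT-based appearance features, first-order dynamics, and one Gaussian Mixture Model per class scored by log-likelihood) can be sketched as follows. This is a minimal illustration on synthetic data, not the authors' implementation: the ROI size, the 4×4 low-frequency DCT block, the two-component diagonal-covariance GMMs, and the `make_seq` toy "lip video" generator are all assumptions for the sake of a runnable example.

```python
import numpy as np
from scipy.fftpack import dct
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)

def dct_features(frame, k=4):
    # 2-D DCT of a lip-ROI frame; keep the low-frequency k x k block
    # as the static (appearance) feature vector (block size is assumed)
    c = dct(dct(frame, axis=0, norm='ortho'), axis=1, norm='ortho')
    return c[:k, :k].ravel()

def sequence_features(frames):
    # Static coefficients per frame plus first-order deltas, so the
    # feature vector carries both appearance and dynamics
    static = np.array([dct_features(f) for f in frames])
    deltas = np.diff(static, axis=0, prepend=static[:1])
    return np.hstack([static, deltas])

def make_seq(freq, T=60, size=16):
    # Hypothetical stand-in for a lip-ROI video: a spatial pattern whose
    # amplitude oscillates over time at a class-specific rate, plus noise
    xs = np.linspace(0, 1, size)
    pattern = np.outer(np.sin(2 * np.pi * freq * xs),
                       np.cos(2 * np.pi * freq * xs))
    return [np.sin(2 * np.pi * freq * t / T) * pattern
            + 0.05 * rng.standard_normal((size, size)) for t in range(T)]

# One GMM per class, trained on frame-level static+delta features
train = {0: sequence_features(make_seq(0.5)),
         1: sequence_features(make_seq(1.5))}
models = {c: GaussianMixture(n_components=2, covariance_type='diag',
                             random_state=0).fit(X)
          for c, X in train.items()}

def classify(frames):
    # Sum frame log-likelihoods under each class model; pick the larger
    feats = sequence_features(frames)
    return max(models, key=lambda c: models[c].score_samples(feats).sum())
```

A test sequence is then classified by whichever class model assigns it the higher total log-likelihood, mirroring the standard GMM-likelihood decision rule the paper's modelling choice implies.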
Original language: English
Pages (from-to): 28-34
Journal: IET Biometrics
Volume: 2
Issue number: 1
DOIs
Publication status: Published - Mar 2013

ASJC Scopus subject areas

  • Computer Vision and Pattern Recognition

