Advances in technology present an opportunity to enhance human–computer interaction through affective state recognition. Affective state recognition is typically based on passive stimuli, such as watching video clips, which do not reflect genuine interaction. This paper presents a study on affective state recognition using active stimuli, i.e. the facial expressions of users as they attempt computerised tasks representative of typical computer use. A data collection experiment is presented for acquiring data from ordinary users whilst they interact with software to complete a set of predefined tasks. In addition, a hierarchical machine learning approach is presented for facial expression-based affective state recognition, which employs a Euclidean distance-based feature representation together with a customised encoding of users' self-reported affective states. The aim is to uncover the potential relationship between facial expressions, as defined by Paul Ekman, and the self-reported emotional states that users specify using Russell's Circumplex model, in relation to their actual feelings and affective states. The main findings of this study suggest that facial expressions cannot precisely reveal the actual feelings of users whilst they interact with common computerised tasks. Moreover, facial expressions vary more across participants during active interaction tasks than during passive interaction.
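The abstract mentions a Euclidean distance-based feature representation of facial expressions but does not detail it. A common construction of this kind takes the pairwise Euclidean distances between detected facial landmarks and normalises them to remove face-size effects; the sketch below illustrates that idea only. The function name and the normalisation choice are assumptions for illustration, not the paper's actual method.

```python
import numpy as np

def landmark_distance_features(landmarks):
    """Illustrative Euclidean distance-based feature vector.

    landmarks: (n, 2) array-like of (x, y) facial landmark coordinates.
    Returns the n*(n-1)/2 unique pairwise distances, normalised by the
    largest distance so the features are invariant to face scale.
    (Hypothetical construction; the paper's exact representation may differ.)
    """
    pts = np.asarray(landmarks, dtype=float)
    diff = pts[:, None, :] - pts[None, :, :]      # (n, n, 2) displacements
    dists = np.sqrt((diff ** 2).sum(axis=-1))     # (n, n) distance matrix
    iu = np.triu_indices(len(pts), k=1)           # upper triangle, no diagonal
    feats = dists[iu]
    return feats / feats.max()                    # scale normalisation
```

For example, three landmarks at (0, 0), (3, 0) and (0, 4) yield pairwise distances 3, 4 and 5, which normalise to 0.6, 0.8 and 1.0. Such a vector could then feed a hierarchical classifier of the kind the abstract describes.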
Journal: Journal of Ambient Intelligence and Humanized Computing
Early online date: 04 Dec 2017
Publication status: Published - 01 Jun 2019
- Human–Computer Interaction
- Facial Expression