Abstract
Emotion recognition (ER) and gender recognition (GR) through non-invasive sensors are highly useful for assessing psychological and physiological behavior. This chapter examines whether the implicit behavioral cues found in electroencephalogram (EEG) signals and eye movements can support GR and ER from psychophysical behavior, with all cues acquired using inexpensive, off-the-shelf sensors. Twenty-eight users (14 male) recognized Ekman's basic emotions from unoccluded faces (no mask) and partially occluded faces (eye or mouth masks); EEG responses encoded gender-specific differences, while eye movements were indicative of facial-emotion perception. Classification with convolutional neural networks and AdaBoost demonstrates that (a) reliable GR (peak area under the ROC curve (AUC) of 0.97) and ER (peak AUC of 0.99) are feasible with EEG and eye features, (b) event-related potential patterns reveal that females process negative emotions differently, and (c) gender differences in eye gaze emerge under partial face occlusions such as eye and mouth masks.
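The classification-and-evaluation setup described above can be sketched in outline. This is a minimal illustration, not the authors' pipeline: the feature matrix is synthetic stand-in data (the study used EEG epochs and gaze features, which are not reproduced here), and only the AdaBoost branch with ROC-AUC scoring is shown.

```python
# Hedged sketch: binary classification (e.g., gender) of per-trial feature
# vectors with AdaBoost, scored by area under the ROC curve (AUC).
# The features below are random surrogates, NOT real EEG/eye-movement data.
import numpy as np
from sklearn.ensemble import AdaBoostClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)

# Hypothetical stand-in for per-trial features (e.g., EEG band power or
# fixation statistics); shapes are illustrative assumptions.
n_trials, n_features = 400, 32
X = rng.normal(size=(n_trials, n_features))
y = rng.integers(0, 2, size=n_trials)   # binary label: e.g., male/female
X[y == 1] += 0.5                        # inject class separability for the demo

X_tr, X_te, y_tr, y_te = train_test_split(
    X, y, test_size=0.25, random_state=0, stratify=y
)

clf = AdaBoostClassifier(n_estimators=100, random_state=0).fit(X_tr, y_tr)
auc = roc_auc_score(y_te, clf.predict_proba(X_te)[:, 1])
print(f"test AUC: {auc:.2f}")
```

With real features, the same scoring step would yield the chapter's reported peak AUCs (0.97 for GR, 0.99 for ER); here the value only reflects the synthetic separability injected above.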
Original language | English |
---|---|
Title of host publication | Affective Computing Applications using Artificial Intelligence in Healthcare |
Subtitle of host publication | Methods, approaches and challenges in system design |
Publisher | Institution of Engineering and Technology |
Pages | 39-65 |
Number of pages | 27 |
ISBN (Electronic) | 9781839537325 |
ISBN (Print) | 9781839537318 |
DOIs | |
Publication status | Published - 1 Jan 2024 |
Keywords
- Electroencephalography
- Emotional face perception
- Eye gaze tracking
- Gender and emotion recognition
- Implicit user behavior
- Unoccluded and occluded faces