Abstract
With the broader adoption of virtual reality (VR), objective physiological measurements that automatically assess a user's emotional state are gaining importance. Emotions affect human behavior, perception, cognition, and decision-making. Recognizing them allows analysis of VR experiences and enables systems to react and adapt to a user's emotions. Facial expressions are among the most potent and natural signals for recognizing emotions. Automatic facial expression recognition (FER) typically relies on facial images. However, in immersive VR environments users wear head-mounted displays (HMDs), which occlude almost the entire upper half of the face. This severely limits the capabilities of conventional FER methods. We address this emerging challenge with our systematic literature review. To our knowledge, it is the first review on FER in immersive VR scenarios where HMDs partially occlude a user's face. We identified 256 related works and included 21 for detailed analysis. Our review provides a comprehensive overview of the state of the art and draws conclusions for future research.
| Field | Value |
| --- | --- |
| Original language | English |
| Title of host publication | PETRA '23 |
| Subtitle of host publication | Proceedings of the 16th International Conference on PErvasive Technologies Related to Assistive Environments |
| Place of publication | New York, NY |
| Publisher | Association for Computing Machinery |
| Pages | 77–82 |
| Number of pages | 6 |
| ISBN (Print) | 9798400700699 |
| DOIs | |
| Publication status | Published - 10 Aug 2023 |
Keywords
- facial expression recognition
- emotion recognition
- virtual reality
- head-mounted display
- affective computing
- systematic literature review