TY - GEN
T1 - Towards emotion recognition in immersive virtual environments
T2 - 2nd International Conference on Computer Science's Complex Systems and their Applications, ICCSA 2021
AU - Amara, Kahina
AU - Ramzan, Naeem
AU - Zenati, Nadia
AU - Djekoune, Oualid
AU - Larbes, Cherif
AU - Guerroudji, Mohamed Amine
AU - Aouam, Djamel
PY - 2021/5/25
Y1 - 2021/5/25
N2 - Virtual Reality (VR) is a powerful tool for simulating complex, real situations and environments, offering researchers unprecedented opportunities to investigate human behaviour under closely controlled laboratory conditions. Facial emotion recognition has attracted great interest for interaction in virtual reality, healthcare systems (e.g. therapeutic applications), video surveillance, and related applications. In this paper, we propose a method for facial emotion recognition in immersive virtual environments based on 2D and 3D geometrical features. We used our collected dataset of 17 subjects performing six basic facial emotions (anger, fear, happiness, surprise, sadness, and neutral), recorded with three devices: Kinect (v1), Kinect (v2), and an RGB HD camera. In addition, we present the performance results of the RGB data for facial emotion recognition using the Bagged Trees algorithm. To assess the performance of the proposed system, we used leave-one-subject-out cross-validation. We compared the performance of 2D and 3D data for facial expression recognition. The obtained results show the superior performance of the RGB-D features provided by Kinect (v2). Our findings highlight that 2D images alone are not robust enough for facial emotion recognition. The built facial emotion models will animate virtual characters that can express emotions via facial expressions, which could be deployed for chatting, learning, and therapeutic intervention.
KW - avatar animation
KW - facial emotion recognition
KW - geometrical features
KW - immersive environment
KW - interaction
KW - machine learning
KW - RGB
KW - RGB-D
KW - virtual reality
UR - http://www.scopus.com/inward/record.url?scp=85110370167&partnerID=8YFLogxK
M3 - Conference contribution
AN - SCOPUS:85110370167
VL - 2904
T3 - CEUR Workshop Proceedings
SP - 253
EP - 263
BT - ICCSA 2021 Conference on Computer Science's Complex Systems and their Applications 2021
A2 - Marir, Toufik
A2 - Bourouis, Abdelhabib
A2 - Benaboud, Rohallah
A2 - Gupta, Varun
A2 - Gupta, Chetna
PB - CEUR Workshop Proceedings
Y2 - 25 May 2021 through 26 May 2021
ER -