A hybrid explainable AI framework applied to global and local facial expression recognition

M. Deramgozin, S. Jovanovic, H. Rabah, N. Ramzan

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

1 Citation (Scopus)
35 Downloads (Pure)


Facial Expression Recognition (FER) systems have many applications, such as human behavior understanding, human-machine interfaces, video games, and health monitoring. The main advantage of traditional white-box methods is their explainability; however, their recognition accuracy depends entirely on the quality of the extracted features. Deep neural networks, on the other hand, achieve better overall accuracy than traditional methods, but they are black-box models and therefore suffer from a lack of reliability and explainability. In this work, we introduce a hybrid explainable AI framework (HEF) composed of a main functional pipeline, in which a Convolutional Neural Network (CNN) classifies input images, and an explainable pipeline that uses Facial Action Units together with the model-agnostic LIME method to explain the obtained results and reinforce the decisions of the main functional pipeline. The proposed HEF has been validated on the CK+ dataset and shows very promising results in terms of the explainability of the obtained results.
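The explainable pipeline relies on LIME, whose core idea is to perturb interpretable regions of the input (superpixels, or here, facial regions), query the black-box classifier on each perturbation, and fit a locally weighted linear surrogate whose coefficients rank the regions by importance. The sketch below illustrates that idea only; the four-segment layout and the toy classifier are hypothetical stand-ins, not the paper's CNN or its implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "image" split into 4 interpretable segments (superpixels); a
# hypothetical classifier whose "happy" score is driven mainly by
# segment 2 (think: the mouth region in a FER image).
NUM_SEGMENTS = 4

def classifier_prob(mask):
    # mask: binary vector, 1 = segment visible, 0 = segment hidden.
    return 0.1 + 0.8 * mask[2] + 0.05 * mask[0]

# LIME-style sampling: random on/off perturbations of the segments.
samples = rng.integers(0, 2, size=(500, NUM_SEGMENTS))
probs = np.array([classifier_prob(m) for m in samples])

# Proximity kernel: perturbations closer to the original (all segments
# visible) get higher weight in the local fit.
distances = (NUM_SEGMENTS - samples.sum(axis=1)) / NUM_SEGMENTS
weights = np.exp(-(distances ** 2) / 0.25)

# Weighted least-squares linear surrogate; its coefficients are the
# explanation: how much each segment pushes the prediction.
X = np.hstack([samples, np.ones((len(samples), 1))])  # add intercept
W = np.diag(weights)
coef = np.linalg.solve(X.T @ W @ X, X.T @ W @ probs)

importance = coef[:NUM_SEGMENTS]
print(importance.round(2))  # segment 2 dominates the explanation
```

Because the toy classifier is exactly linear in the mask, the surrogate recovers its coefficients; on a real CNN the fit is only locally faithful, which is precisely LIME's trade-off.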
Original language: English
Title of host publication: 2021 IEEE International Conference on Imaging Systems and Techniques (IST)
Place of Publication: Piscataway, NJ
ISBN (Electronic): 9781728173719
Publication status: Published - 27 Dec 2021


  • facial expression recognition
  • convolutional neural networks (CNN)
  • eXplainable artificial intelligence (XAI)
  • emotion classification
  • multi-layer perceptron (MLP)


