Active-contour based-on face emotion patterns

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review


Complex Human-Robot Interactions (HRI), e.g. those occurring in assistive-aid applications, require the automatic understanding of human facial expressions so that the intelligent agent can react adequately and in a timely manner to a particular emotion of its human interlocutor. In this work, we present a new approach to automatically detect and recognize such facial emotions. On the one hand, active contours are first applied to detect the face and the changes/movements of its parts (i.e. the mouth, the eyes, etc.), yielding visual patterns that constitute a new quantification of facial expressions. On the other hand, an innovative tree structure built on these detected patterns is proposed to recognize human emotions, such as happiness or surprise, improving both the robustness and the computational efficiency of the automated process. We have evaluated our algorithms on standard face datasets, where our approach has shown excellent performance, outperforming state-of-the-art methods.
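The active-contour step described above can be illustrated with a classic Kass-style snake: contour points iteratively move toward strong image edges while an internal energy keeps the curve smooth and compact. The sketch below is not the authors' implementation — it is a minimal numpy-only illustration on a synthetic bright disc standing in for a face part (e.g. an eye region); all function names, parameters, and the smoothing scheme are assumptions for the example.

```python
import numpy as np

def internal_inverse(n, alpha, beta, gamma):
    """Inverse of (A + gamma*I), where A is the pentadiagonal internal-energy
    matrix of a closed snake (alpha: elasticity, beta: rigidity)."""
    A = np.zeros((n, n))
    for i in range(n):
        A[i, i] = 2 * alpha + 6 * beta
        A[i, (i - 1) % n] = A[i, (i + 1) % n] = -alpha - 4 * beta
        A[i, (i - 2) % n] = A[i, (i + 2) % n] = beta
    return np.linalg.inv(A + gamma * np.eye(n))

def smooth(img, passes=80):
    """Crude isotropic smoothing by repeated 5-point averaging, so the
    edge attraction reaches snake points several pixels away."""
    out = img.astype(float)
    for _ in range(passes):
        out = (out + np.roll(out, 1, 0) + np.roll(out, -1, 0)
                   + np.roll(out, 1, 1) + np.roll(out, -1, 1)) / 5.0
    return out

def evolve_snake(image, snake, alpha=0.5, beta=0.5, gamma=1.0, iters=400):
    """Semi-implicit snake evolution: x <- (A + gamma*I)^-1 (gamma*x + f)."""
    g = smooth(image)
    gy, gx = np.gradient(g)
    edge = gx ** 2 + gy ** 2
    edge /= edge.max() + 1e-12          # normalise edge energy to [0, 1]
    fy, fx = np.gradient(edge)          # external force pulls toward edges
    inv = internal_inverse(len(snake), alpha, beta, gamma)
    x, y = snake[:, 0].copy(), snake[:, 1].copy()
    for _ in range(iters):
        ix = np.clip(np.round(x).astype(int), 0, image.shape[1] - 1)
        iy = np.clip(np.round(y).astype(int), 0, image.shape[0] - 1)
        x = inv @ (gamma * x + fx[iy, ix])
        y = inv @ (gamma * y + fy[iy, ix])
    return np.stack([x, y], axis=1)

# Synthetic stand-in for a face part: a bright disc on a dark background,
# with the snake initialised as a larger circle around it.
img = np.zeros((100, 100))
yy, xx = np.mgrid[:100, :100]
img[(xx - 50) ** 2 + (yy - 50) ** 2 <= 20 ** 2] = 1.0

t = np.linspace(0, 2 * np.pi, 100, endpoint=False)
init = np.stack([50 + 35 * np.cos(t), 50 + 35 * np.sin(t)], axis=1)
final = evolve_snake(img, init)
```

After the iterations, the contour has contracted from its initial radius toward the disc boundary; in the paper's pipeline such converged contours around facial parts would be the raw material for the quantified expression patterns.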
Original language: English
Title of host publication: IAPR International Conference on Computer Analysis of Images and Patterns 2017
Subtitle of host publication: Recognition and Action for Scene Understanding Workshop 2017
Editors: Jorge Diaz, George Azzopardi, Rebeca Marfil
Publisher: University of Malaga
Number of pages: 11
ISBN (Print): 978-84-608-8176-6
Publication status: Published - 2017
Event: IAPR International Conference on Computer Analysis of Images and Patterns: REACTS Workshop - Copenhagen, Denmark
Duration: 22 Aug 2018 - 25 Aug 2018


Conference: IAPR International Conference on Computer Analysis of Images and Patterns
Abbreviated title: CAIP


  • facial expressions
  • emotion modelling
  • computer vision and robotics

