Fusing highly dimensional energy and connectivity features to identify affective states from EEG signals

Pablo Arnau-González, Miguel Arevalillo-Herráez, Naeem Ramzan

Research output: Contribution to journal › Article › peer-review

51 Citations (Scopus)
298 Downloads (Pure)

Abstract

In this paper, a novel method for affect detection is presented. The method combines connectivity-based and channel-based features with a selection method that considerably reduces the dimensionality of the data and allows for efficient classification. In particular, the Relative Energy (RE) and its logarithm in the spatial domain, and the Spectral Power (SP) in the frequency domain, are computed for the four typical frequency bands (α, β, γ and θ), and complemented with the Mutual Information measured over all channel pairs. The resulting features are then reduced by using a hybrid method that combines supervised and unsupervised feature selection. First, Welch's t-test is used to select the features that best separate the classes and discard the ones that are less useful for classification. To this end, all features for which the t-test yields a p-value above a threshold are eliminated. The remaining ones are further reduced by using Principal Component Analysis. Detection results are compared to state-of-the-art methods on DEAP, a database for emotion analysis composed of labeled recordings of 32 subjects watching 40 music videos. The effect of using different classifiers is also evaluated, and a significant improvement is observed in all cases.
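The hybrid reduction described in the abstract can be illustrated as follows. This is a minimal sketch, not the authors' implementation: the function name `hybrid_reduce` and the default thresholds are assumptions, and the feature matrix is taken to hold any mix of energy, spectral-power, and mutual-information features, one row per trial. It uses SciPy's `ttest_ind` with `equal_var=False` for Welch's t-test and scikit-learn's `PCA` for the second stage.

```python
import numpy as np
from scipy.stats import ttest_ind
from sklearn.decomposition import PCA

def hybrid_reduce(X, y, p_threshold=0.05, n_components=10):
    """Hybrid feature reduction: Welch's t-test filter followed by PCA.

    X : (n_samples, n_features) matrix of EEG features (e.g. relative
        energies, band spectral powers, pairwise mutual information).
    y : binary labels per trial (e.g. high/low valence).
    p_threshold and n_components are illustrative defaults, not values
    from the paper.
    """
    X0, X1 = X[y == 0], X[y == 1]
    # Supervised stage: Welch's t-test per feature (unequal variances);
    # features whose p-value exceeds the threshold are discarded.
    _, p = ttest_ind(X0, X1, equal_var=False)
    keep = p < p_threshold
    X_sel = X[:, keep]
    # Unsupervised stage: PCA on the surviving features.
    pca = PCA(n_components=min(n_components, X_sel.shape[1]))
    X_red = pca.fit_transform(X_sel)
    return X_red, keep, pca
```

A classifier would then be trained on `X_red`; the two-stage design first removes features with no class-separating power, so the PCA basis is fitted only on discriminative dimensions.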
Original language: English
Pages (from-to): 81-89
Number of pages: 9
Journal: Neurocomputing
Volume: 244
Early online date: 18 Mar 2017
Publication status: E-pub ahead of print - 18 Mar 2017

Keywords

  • EEG
  • Connectivity features
  • Energy features
  • Emotion recognition
  • Feature reduction
  • Feature extraction

