EEG responses to auditory stimuli for automatic affect recognition

Dirk T. Hettich, Elaina Bolinger, Tamara Matuz, Niels Birbaumer, Wolfgang Rosenstiel, Martin Spüler

Research output: Contribution to journal › Article › peer-review


Brain state classification for communication and control has been well established in the area of brain-computer interfaces over the last decades. Recently, the passive and automatic extraction of additional information regarding the psychological state of users from neurophysiological signals has gained increased attention in the interdisciplinary field of affective computing. We investigated how well specific emotional reactions, induced by auditory stimuli, can be detected in EEG recordings. We introduce an auditory emotion induction paradigm based on the International Affective Digitized Sounds 2nd Edition (IADS-2) database that is also suitable for disabled individuals. Stimuli are grouped in three valence categories: unpleasant, neutral, and pleasant. Significant differences in time-domain event-related potentials are found in the electroencephalogram (EEG) between unpleasant and neutral, as well as pleasant and neutral conditions over midline electrodes. Time-domain data were classified in three binary classification problems using a linear support vector machine (SVM) classifier. We discuss three classification performance measures in the context of affective computing and outline some strategies for conducting and reporting affect classification studies.
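The abstract describes classifying time-domain EEG epochs in binary valence contrasts (e.g. unpleasant vs. neutral) with a linear SVM. A minimal sketch of that pipeline, using scikit-learn and synthetic stand-in data (the feature dimensions, class labels, and random data here are illustrative assumptions, not the authors' actual preprocessing or dataset):

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

# Synthetic stand-in for epoched EEG features:
# 60 trials per condition, each flattened to 128 features
# (e.g. channels x time samples); values are illustrative only.
n_trials, n_features = 60, 128
X_unpleasant = rng.normal(0.5, 1.0, (n_trials, n_features))
X_neutral = rng.normal(0.0, 1.0, (n_trials, n_features))

X = np.vstack([X_unpleasant, X_neutral])
y = np.array([1] * n_trials + [0] * n_trials)  # 1 = unpleasant, 0 = neutral

# Linear SVM, as named in the abstract; 5-fold cross-validated accuracy
clf = SVC(kernel="linear")
scores = cross_val_score(clf, X, y, cv=5)
print(f"mean accuracy: {scores.mean():.2f}")
```

In a real affect-classification study the per-fold scores would be compared against a chance-level estimate appropriate for the trial count, which is one of the reporting issues the paper discusses.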

Original language: English
Article number: 244
Journal: Frontiers in Neuroscience
Issue number: JUN
Publication status: Published - Jun 10 2016


Keywords

  • Affective computing
  • Brain-computer interface
  • Classification
  • Event-related potential
  • Late positive potential
  • Machine learning
  • Support vector machine

ASJC Scopus subject areas

  • Neuroscience(all)


