Implicit processing of visual emotions is affected by sound-induced affective states and individual affective traits

Tiziana Quarto, Giuseppe Blasi, Karen Johanne Pallesen, Alessandro Bertolino, Elvira Brattico

Research output: Contribution to journal › Article

11 Citations (Scopus)

Abstract

The ability to recognize emotions contained in facial expressions is affected by both affective traits and states and varies widely between individuals. While affective traits are stable over time, affective states can be regulated more rapidly by environmental stimuli, such as music, that indirectly modulate the brain state. Here, we tested whether a relaxing or irritating sound environment affects the implicit processing of facial expressions. Moreover, we investigated whether and how individual traits of anxiety and emotional control interact with this process. Thirty-two healthy subjects performed an implicit emotion processing task (presented to subjects as a gender discrimination task) while the sound environment was defined by a) a therapeutic music sequence (MusiCure), b) a noise sequence, or c) silence. Individual changes in mood were sampled before and after the task with a computerized questionnaire. Additionally, emotional control and trait anxiety were assessed in a separate session with paper-and-pencil questionnaires. Results showed a better mood after the MusiCure condition compared with the other experimental conditions, and faster responses to happy faces during MusiCure compared with angry faces during Noise. Moreover, individuals with higher trait anxiety were faster in performing the implicit emotion processing task during MusiCure compared with Silence. These findings suggest that sound-induced affective states are associated with differential responses to angry and happy emotional faces at an implicit stage of processing, and that a relaxing sound environment facilitates implicit emotional processing in anxious individuals.

Original language: English
Article number: e103278
Journal: PLoS One
Publisher: Public Library of Science
ISSN: 1932-6203
Volume: 9
Issue number: 7
DOI: 10.1371/journal.pone.0103278
Publication status: Published - Jul 29 2014


ASJC Scopus subject areas

  • Agricultural and Biological Sciences (all)
  • Biochemistry, Genetics and Molecular Biology (all)
  • Medicine (all)

Cite this

Implicit processing of visual emotions is affected by sound-induced affective states and individual affective traits. / Quarto, Tiziana; Blasi, Giuseppe; Pallesen, Karen Johanne; Bertolino, Alessandro; Brattico, Elvira.

In: PLoS One, Vol. 9, No. 7, e103278, 29.07.2014.

Research output: Contribution to journal › Article

Quarto, Tiziana; Blasi, Giuseppe; Pallesen, Karen Johanne; Bertolino, Alessandro; Brattico, Elvira. Implicit processing of visual emotions is affected by sound-induced affective states and individual affective traits. In: PLoS One. 2014; Vol. 9, No. 7.