Audio-visual interactions for motion perception in depth modulate activity in visual area V3A

Akitoshi Ogawa, Emiliano Macaluso

Research output: Contribution to journal › Article

15 Citations (Scopus)

Abstract

Multisensory signals can enhance the spatial perception of objects and events in the environment. Changes of visual size and auditory intensity provide us with the main cues about motion direction in depth. However, frequency changes in audition and binocular disparity in vision also contribute to the perception of motion in depth. Here, we presented subjects with several combinations of auditory and visual depth-cues to investigate multisensory interactions during processing of motion in depth. The task was to discriminate the direction of auditory motion in depth according to increasing or decreasing intensity. Rising or falling auditory frequency provided an additional within-audition cue that matched or did not match the intensity change (i.e. intensity-frequency (IF) "matched vs. unmatched" conditions). In two-thirds of the trials, a task-irrelevant visual stimulus moved either in the same or opposite direction of the auditory target, leading to audio-visual "congruent vs. incongruent" between-modalities depth-cues. Furthermore, these conditions were presented either with or without binocular disparity. Behavioral data showed that the best performance was observed in the audio-visual congruent condition with IF matched. Brain imaging results revealed maximal response in visual area V3A when all cues provided congruent and reliable depth information (i.e. audio-visual congruent, IF-matched condition including disparity cues). Analyses of effective connectivity revealed increased coupling from auditory cortex to V3A specifically in audio-visual congruent trials. We conclude that within- and between-modalities cues jointly contribute to the processing of motion direction in depth, and that they do so via dynamic changes of connectivity between visual and auditory cortices.

Original language: English
Pages (from-to): 158-167
Number of pages: 10
Journal: NeuroImage
Volume: 71
DOI: 10.1016/j.neuroimage.2013.01.012
Publication status: Published - May 1, 2013


Keywords

  • Audio-visual interaction
  • Binocular disparity
  • Effective connectivity
  • fMRI
  • Motion discrimination
  • V3A

ASJC Scopus subject areas

  • Cognitive Neuroscience
  • Neurology

Cite this

Ogawa, A., & Macaluso, E. (2013). Audio-visual interactions for motion perception in depth modulate activity in visual area V3A. NeuroImage, 71, 158-167. https://doi.org/10.1016/j.neuroimage.2013.01.012 (PMID: 23333414)