On the role of crossmodal prediction in audiovisual emotion perception

Citation formats

  • Authors: Sarah Jessen, Sonja A. Kotz

Standard

On the role of crossmodal prediction in audiovisual emotion perception. / Jessen, Sarah; Kotz, Sonja A.

In: Frontiers in Human Neuroscience, 25.06.2013.

Research output: Contribution to journal › Article


Author

Jessen, Sarah ; Kotz, Sonja A. / On the role of crossmodal prediction in audiovisual emotion perception. In: Frontiers in Human Neuroscience. 2013.

Bibtex

@article{3e291bbdfb074423914561857c7c8dce,
title = "On the role of crossmodal prediction in audiovisual emotion perception",
abstract = "Humans rely on multiple sensory modalities to determine the emotional state of others. In fact, such multisensory perception may be one of the mechanisms explaining the ease and efficiency by which others' emotions are recognized. But how and when exactly do the different modalities interact? One aspect in multisensory perception that has received increasing interest in recent years is the concept of crossmodal prediction. In emotion perception, as in most other settings, visual information precedes the auditory one. Thereby, leading in visual information can facilitate subsequent auditory processing. While this mechanism has often been described in audiovisual speech perception, it has not been addressed so far in audiovisual emotion perception. Based on the current state of the art in (a) crossmodal prediction and (b) multisensory emotion perception research, we propose that it is essential to consider the former in order to fully understand the latter. Focusing on electroencephalographic (EEG) and magnetoencephalographic (MEG) studies, we provide a brief overview of the current research in both fields. In discussing these findings, we suggest that emotional visual information may allow for a more reliable prediction of auditory information compared to non-emotional visual information. In support of this hypothesis, we present a re-analysis of a previous data set that shows an inverse correlation between the N1 response in the EEG and the duration of visual emotional but not non-emotional information. If the assumption that emotional content allows for more reliable predictions can be corroborated in future studies, crossmodal prediction is a crucial factor in our understanding of multisensory emotion perception. {\textcopyright} 2013 Jessen and Kotz.",
keywords = "Audiovisual, Crossmodal Prediction, EEG, Emotion, Multisensory",
author = "Sarah Jessen and Kotz, {Sonja A.}",
year = "2013",
month = jun,
day = "25",
doi = "10.3389/fnhum.2013.00369",
language = "English",
journal = "Frontiers in Human Neuroscience",
issn = "1662-5161",
publisher = "Frontiers Media S. A.",
}
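The BibTeX entry above uses flat `key = "value"` (or bare, as in `month = jun`) field pairs. As a minimal sketch, the fields can be pulled out with the Python standard library alone; this is not a full BibTeX parser (it assumes no escaped quotes inside values, and the abridged entry below is an illustrative subset of the record above):

```python
import re

ENTRY = r'''@article{3e291bbdfb074423914561857c7c8dce,
title = "On the role of crossmodal prediction in audiovisual emotion perception",
author = "Sarah Jessen and Kotz, {Sonja A.}",
year = "2013",
month = jun,
day = "25",
doi = "10.3389/fnhum.2013.00369",
journal = "Frontiers in Human Neuroscience",
issn = "1662-5161",
}'''

def parse_fields(entry: str) -> dict:
    """Extract key = value pairs; values may be quoted or bare (e.g. jun)."""
    fields = {}
    for key, quoted, bare in re.findall(r'(\w+)\s*=\s*(?:"([^"]*)"|(\w+))', entry):
        fields[key] = quoted or bare
    return fields

fields = parse_fields(ENTRY)
print(fields["doi"])    # 10.3389/fnhum.2013.00369
print(fields["month"])  # jun
```

For anything beyond a quick script (nested braces, `@string` macros, escaped characters), a dedicated BibTeX parsing library is the safer choice.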

RIS

TY - JOUR

T1 - On the role of crossmodal prediction in audiovisual emotion perception

AU - Jessen, Sarah

AU - Kotz, Sonja A.

PY - 2013/6/25

Y1 - 2013/6/25

N2 - Humans rely on multiple sensory modalities to determine the emotional state of others. In fact, such multisensory perception may be one of the mechanisms explaining the ease and efficiency by which others' emotions are recognized. But how and when exactly do the different modalities interact? One aspect in multisensory perception that has received increasing interest in recent years is the concept of crossmodal prediction. In emotion perception, as in most other settings, visual information precedes the auditory one. Thereby, leading in visual information can facilitate subsequent auditory processing. While this mechanism has often been described in audiovisual speech perception, it has not been addressed so far in audiovisual emotion perception. Based on the current state of the art in (a) crossmodal prediction and (b) multisensory emotion perception research, we propose that it is essential to consider the former in order to fully understand the latter. Focusing on electroencephalographic (EEG) and magnetoencephalographic (MEG) studies, we provide a brief overview of the current research in both fields. In discussing these findings, we suggest that emotional visual information may allow for a more reliable prediction of auditory information compared to non-emotional visual information. In support of this hypothesis, we present a re-analysis of a previous data set that shows an inverse correlation between the N1 response in the EEG and the duration of visual emotional but not non-emotional information. If the assumption that emotional content allows for more reliable predictions can be corroborated in future studies, crossmodal prediction is a crucial factor in our understanding of multisensory emotion perception. © 2013 Jessen and Kotz.

AB - Humans rely on multiple sensory modalities to determine the emotional state of others. In fact, such multisensory perception may be one of the mechanisms explaining the ease and efficiency by which others' emotions are recognized. But how and when exactly do the different modalities interact? One aspect in multisensory perception that has received increasing interest in recent years is the concept of crossmodal prediction. In emotion perception, as in most other settings, visual information precedes the auditory one. Thereby, leading in visual information can facilitate subsequent auditory processing. While this mechanism has often been described in audiovisual speech perception, it has not been addressed so far in audiovisual emotion perception. Based on the current state of the art in (a) crossmodal prediction and (b) multisensory emotion perception research, we propose that it is essential to consider the former in order to fully understand the latter. Focusing on electroencephalographic (EEG) and magnetoencephalographic (MEG) studies, we provide a brief overview of the current research in both fields. In discussing these findings, we suggest that emotional visual information may allow for a more reliable prediction of auditory information compared to non-emotional visual information. In support of this hypothesis, we present a re-analysis of a previous data set that shows an inverse correlation between the N1 response in the EEG and the duration of visual emotional but not non-emotional information. If the assumption that emotional content allows for more reliable predictions can be corroborated in future studies, crossmodal prediction is a crucial factor in our understanding of multisensory emotion perception. © 2013 Jessen and Kotz.

KW - Audiovisual

KW - Crossmodal Prediction

KW - EEG

KW - Emotion

KW - Multisensory

U2 - 10.3389/fnhum.2013.00369

DO - 10.3389/fnhum.2013.00369

M3 - Article

JO - Frontiers in Human Neuroscience

JF - Frontiers in Human Neuroscience

SN - 1662-5161

ER -