Emotional cues during simultaneous face and voice processing: electrophysiological insights

Bibliographic details
Lead author: Liu, Taosheng
Publication date: 2012
Other authors: Pinheiro, Ana P., Zhao, Zhongxin, Nestor, Paul G., McCarley, Robert W., Niznikiewicz, Margaret
Document type: Article
Language: English (eng)
Source: Repositório Científico de Acesso Aberto de Portugal (Repositórios Científicos)
Full text: http://hdl.handle.net/1822/18131
Abstract: Both facial expression and tone of voice represent key signals of emotional communication, but their brain processing correlates remain unclear. Accordingly, we constructed a novel implicit emotion recognition task in which human faces and voices with neutral, happy, and angry valence were presented simultaneously, within the context of a task requiring recognition of monkey faces and voices. To investigate the temporal unfolding of the processing of affective information from human face-voice pairings, we recorded event-related potentials (ERPs) to these audiovisual test stimuli in 18 healthy subjects. N100, P200, N250, and P300 components were observed at electrodes in the frontal-central region, while P100, N170, and P270 were observed at electrodes in the parietal-occipital region. Results indicated a significant audiovisual stimulus effect on the amplitudes and latencies of components in the frontal-central region (P200, P300, and N250) but not the parietal-occipital region (P100, N170, and P270). Specifically, P200 and P300 amplitudes were more positive for emotional relative to neutral audiovisual stimuli, irrespective of valence, whereas N250 amplitude was more negative for neutral relative to emotional stimuli. No differentiation was observed between the angry and happy conditions. The results suggest that a general effect of emotion on audiovisual processing can emerge as early as 200 ms (P200 peak latency) post stimulus onset, despite the implicit affective processing demands of the task, and that this effect is mainly distributed over the frontal-central region.
Subjects: Science & Technology
Affiliation: Universidade do Minho
Publisher: Public Library of Science
ISSN: 1932-6203
DOI: 10.1371/journal.pone.0031001
PMID: 22383987
Rights: Open access
Funding: This work was supported by a joint PhD student scholarship (2009658022) from the China Scholarship Council awarded to T.S.L., a post-doctoral grant (SFRH/BPD/68967/2010) from the Fundação para a Ciência e a Tecnologia - FCT (Portugal) awarded to A.P.P., and by the National Institute of Mental Health - NIMH (grant R01 MH 040799 awarded to R.W.M.; grant R03 MH 078036 awarded to M.A.N.).
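
The analysis described in the abstract — comparing component amplitudes in fixed time windows at frontal-central electrodes across emotional and neutral audiovisual conditions — can be illustrated with a short sketch. This is not the authors' pipeline: the use of MNE-Python, the synthetic data, the channel subset (Fz, FCz, Cz), the trial counts, and the 150-250 ms P200 window are all assumptions made purely for illustration.

    # Illustrative sketch (not the published pipeline): mean ERP amplitude
    # in an assumed P200 window (150-250 ms) at frontal-central electrodes,
    # compared between "neutral" and "emotional" conditions. Data are synthetic.
    import numpy as np
    import mne

    sfreq = 250.0                    # sampling rate in Hz (assumed)
    ch_names = ["Fz", "FCz", "Cz"]   # illustrative frontal-central subset
    info = mne.create_info(ch_names, sfreq, ch_types="eeg")

    rng = np.random.default_rng(0)
    n_epochs = 40                    # trials per condition (assumed)
    n_times = 150                    # -100 ms to ~496 ms at 250 Hz
    data = rng.normal(0.0, 2e-6, (2 * n_epochs, len(ch_names), n_times))

    # Events array: first 40 epochs tagged "neutral", last 40 "emotional".
    events = np.column_stack([
        np.arange(2 * n_epochs) * n_times,   # unique, increasing sample indices
        np.zeros(2 * n_epochs, dtype=int),
        np.repeat([1, 2], n_epochs),
    ])
    epochs = mne.EpochsArray(data, info, events=events, tmin=-0.1,
                             event_id={"neutral": 1, "emotional": 2})

    # Average each condition, crop to the P200 window, and take the mean
    # amplitude over the frontal-central channels (volts -> microvolts).
    for cond in ("neutral", "emotional"):
        evoked = epochs[cond].average()
        p200 = evoked.copy().crop(tmin=0.15, tmax=0.25)
        amp_uv = p200.data.mean() * 1e6
        print(f"{cond}: mean amplitude in P200 window = {amp_uv:+.2f} µV")

In the study itself, amplitudes and latencies of each component (P200, N250, P300) were compared across the three valence conditions; the window-mean approach above is just one common way such an amplitude contrast can be quantified.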