" It's not what you say, but how you say it ": A reciprocal temporo-frontal network for affective prosody

David I. Leitman, Daniel H. Wolf, John D. Ragland, Petri Laukka, James Loughead, Jeffrey N. Valdez, Daniel C. Javitt, Bruce I. Turetsky, Ruben C. Gur

Research output: Contribution to journal › Article

98 Citations (Scopus)

Abstract

Humans communicate emotion vocally by modulating acoustic cues such as pitch, intensity and voice quality. Research has documented how the relative presence or absence of such cues alters the likelihood of perceiving an emotion, but the neural underpinnings of acoustic cue-dependent emotion perception remain obscure. Using functional magnetic resonance imaging in 20 subjects, we examined a reciprocal circuit consisting of superior temporal cortex, amygdala and inferior frontal gyrus that may underlie affective prosodic comprehension. Results showed that increased saliency of emotion-specific acoustic cues was associated with increased activation in superior temporal cortex [planum temporale (PT), posterior superior temporal gyrus (pSTG), and posterior middle temporal gyrus (pMTG)] and amygdala, whereas decreased saliency of acoustic cues was associated with increased inferior frontal activity and temporo-frontal connectivity. These results suggest that sensory-integrative processing is facilitated when the acoustic signal is rich in affective information, yielding increased activation in temporal cortex and amygdala. Conversely, when the acoustic signal is ambiguous, greater evaluative processes are recruited, increasing activation in inferior frontal gyrus (IFG) and IFG-STG connectivity. Auditory regions may thus integrate acoustic information with amygdala input to form emotion-specific representations, which are evaluated within inferior frontal regions.

Original language: English (US)
Journal: Frontiers in Human Neuroscience
Volume: 4
DOI: 10.3389/fnhum.2010.00019
State: Published - 2010

Keywords

  • Amygdala
  • Auditory cortex
  • Emotion
  • Inferior frontal gyrus
  • Prosody
  • Speech

ASJC Scopus subject areas

  • Psychiatry and Mental health
  • Neurology
  • Biological Psychiatry
  • Behavioral Neuroscience
  • Neuropsychology and Physiological Psychology

Cite this

" It's not what you say, but how you say it " : A reciprocal temporo-frontal network for affective prosody. / Leitman, David I.; Wolf, Daniel H.; Ragland, John D; Laukka, Petri; Loughead, James; Valdez, Jeffrey N.; Javitt, Daniel C.; Turetsky, Bruce I.; Gur, Ruben C.

In: Frontiers in Human Neuroscience, Vol. 4, 2010.

Research output: Contribution to journal › Article

Leitman, David I.; Wolf, Daniel H.; Ragland, John D.; Laukka, Petri; Loughead, James; Valdez, Jeffrey N.; Javitt, Daniel C.; Turetsky, Bruce I.; Gur, Ruben C. / "It's not what you say, but how you say it": A reciprocal temporo-frontal network for affective prosody. In: Frontiers in Human Neuroscience. 2010; Vol. 4.
@article{438d3a6443834fc692e37e097c99b109,
title = "{"} It's not what you say, but how you say it {"}: A reciprocal temporo-frontal network for affective prosody",
abstract = "Humans communicate emotion vocally by modulating acoustic cues such as pitch, intensity and voice quality. Research has documented how the relative presence or absence of such cues alters the likelihood of perceiving an emotion, but the neural underpinnings of acoustic cue-dependent emotion perception remain obscure. Using functional magnetic resonance imaging in 20 subjects we examined a reciprocal circuit consisting of superior temporal cortex, amygdala and inferior frontal gyrus that may underlie affective prosodic comprehension. Results showed that increased saliency of emotion-specifi c acoustic cues was associated with increased activation in superior temporal cortex [planum temporale (PT), posterior superior temporal gyrus (pSTG), and posterior superior middle gyrus (pMTG)] and amygdala, whereas decreased saliency of acoustic cues was associated with increased inferior frontal activity and temporo-frontal connectivity. These results suggest that sensory-integrative processing is facilitated when the acoustic signal is rich in affective information, yielding increased activation in temporal cortex and amygdala. Conversely, when the acoustic signal is ambiguous, greater evaluative processes are recruited, increasing activation in inferior frontal gyrus (IFG) and IFG STG connectivity. Auditory regions may thus integrate acoustic information with amygdala input to form emotion-specifi c representations, which are evaluated within inferior frontal regions.",
keywords = "Amygdala, Auditory cortex, Emotion, Inferior frontal gyrus, Prosody, Speech",
author = "Leitman, {David I.} and Wolf, {Daniel H.} and Ragland, {John D} and Petri Laukka and James Loughead and Valdez, {Jeffrey N.} and Javitt, {Daniel C.} and Turetsky, {Bruce I.} and Gur, {Ruben C.}",
year = "2010",
doi = "10.3389/fnhum.2010.00019",
language = "English (US)",
volume = "4",
journal = "Hematology / the Education Program of the American Society of Hematology. American Society of Hematology. Education Program",
issn = "1662-5161",
publisher = "Frontiers Research Foundation",

}

TY - JOUR

T1 - " It's not what you say, but how you say it "

T2 - A reciprocal temporo-frontal network for affective prosody

AU - Leitman, David I.

AU - Wolf, Daniel H.

AU - Ragland, John D

AU - Laukka, Petri

AU - Loughead, James

AU - Valdez, Jeffrey N.

AU - Javitt, Daniel C.

AU - Turetsky, Bruce I.

AU - Gur, Ruben C.

PY - 2010

Y1 - 2010

N2 - Humans communicate emotion vocally by modulating acoustic cues such as pitch, intensity and voice quality. Research has documented how the relative presence or absence of such cues alters the likelihood of perceiving an emotion, but the neural underpinnings of acoustic cue-dependent emotion perception remain obscure. Using functional magnetic resonance imaging in 20 subjects, we examined a reciprocal circuit consisting of superior temporal cortex, amygdala and inferior frontal gyrus that may underlie affective prosodic comprehension. Results showed that increased saliency of emotion-specific acoustic cues was associated with increased activation in superior temporal cortex [planum temporale (PT), posterior superior temporal gyrus (pSTG), and posterior middle temporal gyrus (pMTG)] and amygdala, whereas decreased saliency of acoustic cues was associated with increased inferior frontal activity and temporo-frontal connectivity. These results suggest that sensory-integrative processing is facilitated when the acoustic signal is rich in affective information, yielding increased activation in temporal cortex and amygdala. Conversely, when the acoustic signal is ambiguous, greater evaluative processes are recruited, increasing activation in inferior frontal gyrus (IFG) and IFG-STG connectivity. Auditory regions may thus integrate acoustic information with amygdala input to form emotion-specific representations, which are evaluated within inferior frontal regions.

AB - Humans communicate emotion vocally by modulating acoustic cues such as pitch, intensity and voice quality. Research has documented how the relative presence or absence of such cues alters the likelihood of perceiving an emotion, but the neural underpinnings of acoustic cue-dependent emotion perception remain obscure. Using functional magnetic resonance imaging in 20 subjects, we examined a reciprocal circuit consisting of superior temporal cortex, amygdala and inferior frontal gyrus that may underlie affective prosodic comprehension. Results showed that increased saliency of emotion-specific acoustic cues was associated with increased activation in superior temporal cortex [planum temporale (PT), posterior superior temporal gyrus (pSTG), and posterior middle temporal gyrus (pMTG)] and amygdala, whereas decreased saliency of acoustic cues was associated with increased inferior frontal activity and temporo-frontal connectivity. These results suggest that sensory-integrative processing is facilitated when the acoustic signal is rich in affective information, yielding increased activation in temporal cortex and amygdala. Conversely, when the acoustic signal is ambiguous, greater evaluative processes are recruited, increasing activation in inferior frontal gyrus (IFG) and IFG-STG connectivity. Auditory regions may thus integrate acoustic information with amygdala input to form emotion-specific representations, which are evaluated within inferior frontal regions.

KW - Amygdala

KW - Auditory cortex

KW - Emotion

KW - Inferior frontal gyrus

KW - Prosody

KW - Speech

UR - http://www.scopus.com/inward/record.url?scp=77957680260&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=77957680260&partnerID=8YFLogxK

U2 - 10.3389/fnhum.2010.00019

DO - 10.3389/fnhum.2010.00019

M3 - Article

C2 - 20204074

AN - SCOPUS:77957680260

VL - 4

JO - Frontiers in Human Neuroscience

JF - Frontiers in Human Neuroscience

SN - 1662-5161

ER -