Mystery of how we process sarcasm is solved

A brain area key to understanding speech has been identified by scientists.

The neurons respond to changes in vocal pitch – crucial to conveying the meaning of what is being said. 

Tone is so important to language that the exact same words can carry a very different message.

For example, ‘Anna likes oranges’ can be a statement, or if the pitch varies near the end, the phrase can be posed as a question, ‘Anna likes oranges?’

Pitch is also key to understanding sarcasm and emotions such as anger.  

This animation highlights pitch-sensing cells in a small brain area known as the superior temporal gyrus (STG). As the pitch of the sentence goes high (red), neural activity in certain areas increases (credit: Carla Schaffer / AAAS)

PITCH IN SPEECH 

The study found that changes in vocal pitch, part of what linguists call ‘speech prosody’, are almost as fundamental to human communication as melody is to music.

In tonal languages such as Mandarin Chinese, pitch changes can completely alter a word’s meaning.

But even in a non-tonal language like English, differences in pitch can significantly change the meaning of a spoken sentence.

The brain’s ability to interpret these changes in tone on the fly is remarkable given that each speaker has their own typical vocal pitch and style – some have low voices and others high.

The brain must track and interpret these pitch changes while simultaneously parsing which consonants and vowels are being uttered, what words they form, and how they combine into phrases and sentences – all within milliseconds.

Humans are so good at discerning relative pitch that they can even recognise musical melodies when the notes are transposed.

Now, neuroscientist Claire Tang and colleagues at the University of California, San Francisco, have found the brain cells responsible by studying ten patients suffering from epilepsy.

Ms Tang said: ‘One of the lab’s missions is to understand how the brain converts sounds into meaning.

‘What we are seeing here is there are neurons in the brain’s neocortex that are processing not just what words are being said, but how those words are said.’

Her team released a video highlighting these cells in a small area known as the superior temporal gyrus (STG).

As the pitch of the sentence goes high, neural activity – shown by the colour of the circle on the brain – increases.

Implants monitored the participants’ brainwaves as they were exposed to synthesised voices that varied in intonation.

While awaiting surgery, the volunteers had been fitted with the electrodes to help surgeons map their brains and avoid damaging critical areas.

The researchers asked them to listen to recordings of four sentences, each spoken by three different synthesised voices.

They included ‘Humans value genuine behaviour’, ‘Movies demand minimal energy’, ‘Reindeer are a visual animal’ and ‘Lawyers give a relevant opinion’.

The team found some neurons could distinguish between three synthesised speakers based on differences in their average vocal pitch range.

Neurons in the brain’s neocortex were found that process not just what words are being said, but how those words are said. The neurons respond differently to high pitched sounds (pictured) than low pitched ones

Others could distinguish between four sentences, no matter which speaker was saying them, based on the different kinds of sounds – ‘reindeer’ sounds different from ‘lawyers’ no matter who is talking.

And yet another group of neurons could distinguish between the four different intonation patterns.

These neurons changed their activity depending on where the emphasis fell in the sentence, regardless of which sentence it was or who was saying it.

The data revealed regions that responded the same way to completely different sentences that were presented with the exact same pitch pattern.

Researchers also spotted distinct neural responses for male and female voices, which tend to differ in absolute pitch.

Some neurons changed their activity depending on where the emphasis fell to a lower tone (pictured) in the sentence, regardless of which sentence it was or who was saying it

The study, published in Science, found changes in vocal pitch, part of what linguists call ‘speech prosody’, are almost as fundamental to human communication as melody is to music.


The study found that changes in vocal pitch, part of what linguists call ‘speech prosody’, are almost as fundamental to human communication as melody is to music (stock image)

The sentences were designed to have the same length and construction, and could be played with four different intonations: neutral, emphasising the first word, emphasising the third word, or as a question.
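The stimulus set described above can be sketched as a simple grid: four sentences, four intonation patterns and three synthesised voices give 48 possible combinations. This is only an illustrative sketch of that design; the speaker labels are hypothetical placeholders, not names used in the study.

```python
# Sketch of the stimulus design: 4 sentences x 4 intonations x 3 voices.
from itertools import product

sentences = [
    'Humans value genuine behaviour',
    'Movies demand minimal energy',
    'Reindeer are a visual animal',
    'Lawyers give a relevant opinion',
]
intonations = ['neutral', 'emphasis on first word',
               'emphasis on third word', 'question']
speakers = ['synth voice 1', 'synth voice 2', 'synth voice 3']  # hypothetical labels

# Every way of pairing a sentence with an intonation and a voice.
stimuli = list(product(sentences, intonations, speakers))
print(len(stimuli))  # 4 * 4 * 3 = 48 combinations
```

Crossing the three factors like this is what lets the analysis separate neurons tuned to the sentence's sounds from those tuned to its intonation or its speaker.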

The researchers then exposed the participants to sentences spoken by hundreds of male and female speakers, detecting areas of the brain that were tuned to high and low relative pitch, respectively.

This showed that, while the neurons responsive to the different speakers were focused on absolute pitch, the ones responsive to intonation were more focused on relative pitch – how it changed from moment to moment during the recording.
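The distinction between absolute and relative pitch can be made concrete with a small numerical sketch: if each speaker's pitch contour is expressed relative to their own average (conventionally in log space, since pitch perception is roughly logarithmic), the same rising "question" contour looks identical whether it is spoken in a low or a high voice. The pitch values below are hypothetical, chosen only to illustrate the idea.

```python
# Sketch: the same intonation contour at two different absolute pitches
# becomes identical once expressed relative to each speaker's own average.
import math

def relative_pitch(contour_hz):
    """Subtract the speaker's mean log pitch from a pitch contour."""
    logs = [math.log(f) for f in contour_hz]
    mean = sum(logs) / len(logs)
    return [l - mean for l in logs]

# A rising "question" contour: a low voice, and the same contour an octave up.
low_voice = [100, 105, 110, 130]    # Hz, hypothetical values
high_voice = [200, 210, 220, 260]   # same contour, doubled in frequency

rel_low = relative_pitch(low_voice)
rel_high = relative_pitch(high_voice)

# The absolute pitches differ, but the relative contours match exactly.
for a, b in zip(rel_low, rel_high):
    assert abs(a - b) < 1e-9
```

In these terms, the speaker-sensitive neurons behave like detectors of the raw Hz values, while the intonation-sensitive neurons behave like detectors of the normalised contour.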

Ms Tang said: ‘To me this was one of the most exciting aspects of our study.

‘We were able to show not just where prosody is encoded in the brain, but also how, by explaining the activity in terms of specific changes in vocal pitch.’

Senior author Professor Edward Chang added: ‘Now, a major unanswered question is how the brain controls our vocal tracts to make these intonational speech sounds.

‘We hope we can solve this mystery soon.’

Read more at DailyMail.co.uk