The Neuroscience of Language

Transcript of The Neuroscience of Language

Page 1: The Neuroscience of Language

The Neuroscience of Language

Page 2: The Neuroscience of Language

What is language? What is it for?

• Rapid, efficient communication
  – (as such, other kinds of communication might be called language for our purposes and might share underlying neural mechanisms)

• Two broad but interacting domains:
  – Comprehension
  – Production

Page 3: The Neuroscience of Language

Speech comprehension

• Is an auditory task
  – (but stay tuned for the McGurk Effect!)

• Is also a selective attention task
  – Auditory scene analysis

• Is a temporal task
  – We need a way to represent both frequency (pitch) and time when talking about language -> the speech spectrogram

Page 4: The Neuroscience of Language

Speech comprehension

• Is also a selective attention task
  – Auditory scene analysis
    • Which streams of sound constitute speech?
    • Which one stream constitutes the to-be-comprehended speech?
    • Not a trivial problem, because sound waves combine prior to reaching the ear (see the sketch below)
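
A minimal NumPy sketch (not from the slides) of why this is hard: concurrent sources sum into a single waveform before they ever reach the ear, so the individual streams must be recovered from the mixture alone. The two synthetic "talkers" are hypothetical stand-ins.

```python
# Minimal sketch (not from the slides): two concurrent sound sources sum
# linearly before reaching the ear, so only the mixture is available to
# the listener. The two synthetic "talkers" below are hypothetical stand-ins.
import numpy as np

sr = 16000                                      # assumed sample rate in Hz
t = np.arange(0, 1.0, 1.0 / sr)                 # one second of time samples

talker_a = 0.5 * np.sin(2 * np.pi * 220 * t)    # stand-in for one voice
talker_b = 0.5 * np.sin(2 * np.pi * 310 * t)    # stand-in for a second voice

mixture = talker_a + talker_b                   # what actually arrives at the eardrum

# The brain must recover the individual streams from `mixture` alone;
# that recovery is the core of auditory scene analysis.
print(mixture[:5])
```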

Page 5: The Neuroscience of Language

Speech comprehension

• Is a temporal task
  – Speech is a time-varying signal
  – It is meaningless to freeze a word in time (like you can do with an image)
  – We need a way to consider both frequency (pitch) and time when talking about language -> the speech spectrogram (a sketch of how one is computed follows)
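
A minimal sketch of how such a spectrogram can be computed with SciPy and Matplotlib. The file name "speech.wav" is a placeholder for any mono speech recording, not a file supplied with the lecture.

```python
# Minimal sketch: compute and plot a spectrogram (frequency vs. time, power as
# shade) for a speech recording. "speech.wav" is a placeholder file name for a
# mono recording; it is not a file supplied with the lecture.
import numpy as np
import matplotlib.pyplot as plt
from scipy.io import wavfile
from scipy.signal import spectrogram

sr, samples = wavfile.read("speech.wav")
freqs, times, power = spectrogram(samples, fs=sr, nperseg=512, noverlap=384)

plt.pcolormesh(times, freqs, 10 * np.log10(power + 1e-12), shading="auto")
plt.xlabel("Time (s)")
plt.ylabel("Frequency (Hz)")
plt.title("Speech spectrogram")
plt.show()
```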

Page 6: The Neuroscience of Language

What forms the basis of spoken language?

• Phonemes

• Phonemes strung together over time with prosody

Page 7: The Neuroscience of Language

What forms the basis of spoken language?

• Phonemes = smallest perceptual unit of sound

• Phonemes strung together over time with prosody

Page 8: The Neuroscience of Language

What forms the basis of spoken language?

• Phonemes = smallest perceptual unit of sound

• Phonemes strung together over time with prosody = the variation of pitch and loudness over the time scale of a whole sentence

Page 9: The Neuroscience of Language

What forms the basis of spoken language?

• Phonemes = smallest perceptual unit of sound

• Phonemes strung together over time with prosody = the variation of pitch and loudness over the time scale of a whole sentence

To visualize these we need slick acoustic analysis software…which I’ve got
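
As a rough stand-in for such software, here is a minimal sketch using the librosa library to trace the two ingredients of prosody, pitch (F0) and loudness (RMS energy), across an utterance. The file name "sentence.wav" and the 75-400 Hz pitch search range are assumptions for illustration.

```python
# Minimal sketch (assumes the librosa library): trace the two ingredients of
# prosody, pitch (F0) and loudness (RMS energy), across an utterance.
# "sentence.wav" and the 75-400 Hz F0 search range are assumptions.
import numpy as np
import librosa

y, sr = librosa.load("sentence.wav", sr=None)

# Fundamental frequency (perceived pitch) per analysis frame, via the YIN estimator.
f0 = librosa.yin(y, fmin=75, fmax=400, sr=sr)

# RMS energy per analysis frame, a rough proxy for loudness.
rms = librosa.feature.rms(y=y)[0]

print("median F0 (Hz):", np.nanmedian(f0))
print("mean RMS energy:", rms.mean())
```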

Page 10: The Neuroscience of Language

What forms the basis of spoken language?

• The auditory system is inherently tonotopic
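
One standard way to express tonotopy quantitatively is the Greenwood function, which maps position along the basilar membrane to characteristic frequency. The sketch below uses the commonly cited human constants and is illustrative, not part of the original slides.

```python
# Minimal sketch: the Greenwood function maps relative position along the
# basilar membrane (0 = apex, 1 = base) to characteristic frequency, which is
# one way to express tonotopy. Constants are the commonly cited human values.
import numpy as np

def greenwood_hz(x):
    """Characteristic frequency (Hz) at relative cochlear position x."""
    A, a, k = 165.4, 2.1, 0.88
    return A * (10 ** (a * x) - k)

for x in np.linspace(0.0, 1.0, 5):
    print(f"position {x:.2f} -> {greenwood_hz(x):8.1f} Hz")
```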

Page 11: The Neuroscience of Language

Is speech comprehension therefore an image matching problem?

• If your brain could just match the picture on the basilar membrane with a lexical object in memory, speech would be comprehended
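
A toy sketch of what that image-matching scheme would look like in practice: slide stored spectrogram templates over the incoming spectrogram and report the best correlation. All arrays here are made up, and "dog" and "cat" are hypothetical stored templates; the following slides explain why this scheme cannot work as stated.

```python
# Toy sketch of the image-matching idea: slide stored spectrogram "templates"
# over the incoming spectrogram and report the best-correlating word. Every
# array here is made up; "dog" and "cat" are hypothetical lexical templates.
import numpy as np

rng = np.random.default_rng(0)
incoming = rng.random((64, 200))       # fake spectrogram: 64 freq bins x 200 frames
templates = {
    "dog": rng.random((64, 30)),       # fake stored pattern for "dog"
    "cat": rng.random((64, 30)),       # fake stored pattern for "cat"
}

def best_match(spec, templates):
    """Return the template word whose peak correlation with `spec` is highest."""
    scores = {}
    for word, tpl in templates.items():
        best = -np.inf
        for start in range(spec.shape[1] - tpl.shape[1] + 1):
            window = spec[:, start:start + tpl.shape[1]]
            best = max(best, np.corrcoef(window.ravel(), tpl.ravel())[0, 1])
        scores[word] = best
    return max(scores, key=scores.get), scores

print(best_match(incoming, templates))
```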

Page 12: The Neuroscience of Language

Problems facing the brain

• Acoustic-phonetic invariance
  – The assumption that each phoneme should match one and only one pattern in the spectrogram
  – This is not the case! For example, /d/ followed by different vowels (sketched below):
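
A toy sketch of those diverging patterns: the second-formant (F2) transition cueing /d/ rises toward the high F2 of /i/ but falls toward the low F2 of /u/. The frequency values are approximate textbook figures used only for illustration.

```python
# Toy sketch: the second-formant (F2) trajectory that cues /d/ depends on the
# following vowel, so there is no single spectrogram pattern for the phoneme.
# The frequency values are approximate, for illustration only.
import numpy as np

times_ms = np.linspace(0, 50, 6)          # first ~50 ms of each syllable
f2_di = np.linspace(2200, 2600, 6)        # F2 rises toward the high F2 of /i/
f2_du = np.linspace(1200, 900, 6)         # F2 falls toward the low F2 of /u/

for t, a, b in zip(times_ms, f2_di, f2_du):
    print(f"{t:4.0f} ms   /di/ F2 = {a:5.0f} Hz   /du/ F2 = {b:5.0f} Hz")
```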

Page 13: The Neuroscience of Language

Problems facing the brain

• The Segmentation Problem:
  – The stream of acoustic input is not physically segmented into discrete phonemes, words, phrases, etc.
  – Silent gaps don't always indicate (aren't perceived as) interruptions in speech (see the sketch below)
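
A toy sketch of why a purely acoustic rule fails: a naive segmenter that places boundaries wherever energy drops will also "segment" at silent gaps that occur inside words (for example, stop-consonant closures). The test signal below is synthetic.

```python
# Toy sketch: a naive segmenter that places boundaries wherever energy drops.
# On real speech such silences often fall inside words (e.g. stop-consonant
# closures), so acoustic gaps do not line up with linguistic boundaries.
import numpy as np

def naive_segments(signal, sr, frame_ms=20, silence_thresh=0.01):
    """Return (start, end) sample indices of stretches whose energy exceeds the threshold."""
    frame = int(sr * frame_ms / 1000)
    energy = np.array([np.mean(signal[i * frame:(i + 1) * frame] ** 2)
                       for i in range(len(signal) // frame)])
    voiced = energy > silence_thresh
    segments, start = [], None
    for i, v in enumerate(voiced):
        if v and start is None:
            start = i * frame
        elif not v and start is not None:
            segments.append((start, i * frame))
            start = None
    if start is not None:
        segments.append((start, len(voiced) * frame))
    return segments

sr = 16000
t = np.arange(0, 0.2, 1 / sr)
burst = 0.3 * np.sin(2 * np.pi * 200 * t)                    # 200 ms of "speech-like" energy
signal = np.concatenate([burst, np.zeros(sr // 20), burst])  # 50 ms gap in the middle
print(naive_segments(signal, sr))                            # the gap splits one "word" in two
```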

Page 14: The Neuroscience of Language

Problems facing the brain

• The Segmentation Problem:
  – The stream of acoustic input is not physically segmented into discrete phonemes, words, phrases, etc.
  – The continuous speech stream is sometimes perceived as having gaps

Page 15: The Neuroscience of Language

How (where) does the brain solve these problems?

– Note that the brain can't know that an incoming sound is speech until it has first worked out whether or not it is!

– The signal chain goes from non-specific -> specific

– Neuroimaging has to take the same approach to track down speech-specific regions

Page 16: The Neuroscience of Language

Functional Anatomy of Speech Comprehension

• The low-level auditory pathway is not specialized for speech sounds

• Both speech and non-speech sounds activate primary auditory cortex (bilateral Heschl’s Gyrus) on the top of the superior temporal gyrus

Page 17: The Neuroscience of Language

Functional Anatomy of Speech Comprehension

• Which parts of the auditory pathway are specialized for speech?

• Binder et al. (2000)
  – fMRI
  – Presented several kinds of stimuli:
    • white noise and pure tones (non-word-like acoustical properties)
    • non-words and reversed words (word-like acoustical properties but no lexical associations)
    • real words (word-like acoustical properties and lexical associations)

Page 18: The Neuroscience of Language

Functional Anatomy of Speech Comprehension

• Relative to “baseline” scanner noise

– Widespread auditory cortex activation (bilaterally) for all stimuli

– Why isn’t this surprising?

Page 19: The Neuroscience of Language

Functional Anatomy of Speech Comprehension

• Statistical contrasts reveal specialization for speech-like sounds (a toy sketch of the contrast logic follows)
  – Superior temporal gyrus
  – Somewhat more prominent on the left side
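
A minimal sketch of the logic of such a contrast, using simulated data: per voxel, test whether responses to speech-like stimuli reliably exceed responses to non-speech stimuli. This illustrates only the idea of a contrast, not Binder et al.'s actual analysis pipeline.

```python
# Minimal sketch of the logic of a statistical contrast, with simulated data:
# per voxel, test whether responses to speech-like stimuli reliably exceed
# responses to non-speech stimuli. Not Binder et al.'s actual pipeline.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
n_voxels, n_trials = 1000, 40

nonspeech = rng.normal(1.0, 0.5, size=(n_voxels, n_trials))
speech = rng.normal(1.0, 0.5, size=(n_voxels, n_trials))
speech[:50] += 0.6        # a small, hypothetical speech-selective subset of voxels

t_vals, p_vals = stats.ttest_ind(speech, nonspeech, axis=1)
selective = np.where((t_vals > 0) & (p_vals < 0.001))[0]   # uncorrected threshold, illustration only
print(f"{len(selective)} voxels pass the speech > non-speech contrast")
```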

Page 20: The Neuroscience of Language

Functional Anatomy of Speech Comprehension

• Further highly sensitive contrasts to identify specialization for words relative to other speech-like sounds revealed only a few small clusters of voxels

• Brodmann areas
  – Area 39
  – Areas 20, 21, and 37
  – Areas 46 and 10

Page 21: The Neuroscience of Language

Next time we’ll discuss

• Speech production
• Aphasia
• Lateralization