This dissertation studies the neural basis of song, a universal human behavior. The relationship of words and melodies in the perception of song at phonological, semantic, melodic, and rhythmic levels of processing was investigated using the fine temporal resolution of electroencephalography (EEG). The observations reported here may shed light on a ubiquitous human experience and also inform the debate over whether language and music share neural resources or recruit domain-specific neural mechanisms.
Experiment 1 was designed to determine whether words and melody in song are processed interactively or independently. Participants listened to sung words in which the melodies and/or the words were similar or different, and performed a same/different task while attending to the linguistic and musical dimensions in separate blocks of trials. Event-related potential (ERP) and behavioral data converged in showing interactive processing between the linguistic and musical dimensions of sung words, regardless of the direction of attention. In particular, the N400 component, a well-established marker of semantic processing, was modulated by musical melody. The observation that variations in musical features affect lexico-semantic processing in sung language was a novel finding with implications for shared neural resources between language and music.
Experiment 2 was designed to explore the idea that well-aligned textsettings, in which strong syllables occur on strong beats, capture listeners' attention and help them understand song lyrics. EEG was recorded while participants listened to sung sentences whose linguistic stress patterns were well-aligned, misaligned, or variably aligned with the musical meter, and performed a lexical decision task on subsequently presented visual targets. Results showed that induced beta and evoked gamma power were modulated differently for well-aligned and misaligned syllables, and that task performance was adversely affected when visual targets followed misaligned and variably aligned sentences. These findings suggest that alignment of linguistic stress and musical meter in song enhances beat tracking and linguistic segmentation by entraining periodic fluctuations in high-frequency brain activity to the stimuli. A series of follow-up studies is outlined to further investigate the relationship between rhythmic attending in speech and music, and the influence of metrical alignment in songs on childhood language acquisition.
Advisor: Large, Edward N.
School: Florida Atlantic University
School Location: United States -- Florida
Source: DAI-B 71/10, Dissertation Abstracts International
Subjects: Linguistics, Neurosciences, Music
Keywords: Brain, Neural behavioral correlates, Prosody, Rhythm, Song, Song cognition