Academic journal article: Canadian Journal of Music Therapy

Emotion without Words: A Comparison Study of Music and Speech Prosody / L'émotion sans mots : une étude comparative de la prosodie musicale et de la prosodie de la parole

Article excerpt

Strong connections have been identified between music and language, especially in relation to evolutionary background (Perlovsky, 2012), brain connectivity (Koelsch, Gunter, Wittfoth, & Sammler, 2005), and skill transfer (Besson, Chobert, & Marie, 2011). Some evolutionary theorists have suggested that the connection between music and language lies in their communicative uses (Cross, 2009; Juslin & Laukka, 2003), a common one being emotional communication. It has been further suggested that language is a more advanced form of emotional communication derived from music (Mithen, 2009) and that both music and language derive from a pre-linguistic system that shared elements of both for communicative purposes (Masataka, 2009). Such theoretical ideas, combined with empirical research on the topic (e.g., Johansson, 2008; Levitin & Menon, 2003; Patel, 2008), show that music and language share many similarities in terms of emotional communication. As emotional expression is considered paramount in communicating internal feelings and intentions to others (Scherer, 1995), and both music and language have been shown to communicate emotions effectively (Steinbeis & Koelsch, 2008), it is useful to compare the similarities and differences in how emotion is communicated through each. These connections will be explored throughout the literature review.

Literature Review

Emotional Communication in Language and Music

Emotional communication in language and music is thought to be related to the prosody of language (Pell, 2006) and the dynamics of music (Van der Zwaag, Westerink, & Van den Broek, 2011), and to how these effectively convey meaning and emotion. Prosody in language is defined by intonation, loudness, and tempo (Mitchell, Elliott, Barry, Cruttenden, & Woodruff, 2003) and can operate independently of the lexical elements of speech, defined as the word and word-like elements of language (Friederici, Meyer, & von Cramon, 2000). The combination of semantic content with different configurations of prosodic elements produces emotional communication. The dynamic aspects of music that help to express emotion are said to include, but are not limited to, tempo, mode, harmony, tonality, pitch, rhythm, tension-resolution patterns, and timing (Thompson, 2009). Different combinations of prosodic or dynamic information yield differences in the type of emotion communicated, both in music and in language. Interestingly, neuroimaging studies have shown a primary emotion pathway activated in response to both musical and spoken emotional content, as well as distinct networks activating different areas of the right hemisphere of the brain (Steinbeis & Koelsch, 2008). Music and language also share processing pathways, and their processing interacts in the brain, with each affecting the other (Fiveash & Pammer, 2014). The ability of both speech and music to communicate emotions has been widely researched, and a growing body of evidence points toward strong connections between music and language, particularly in relation to their respective emotional communication abilities.

Communicative Connections Between Music and Language

The extent of the connection between music and speech emotions can be seen in neuropsychological cases where impairment in one domain affects performance in the other. While there are many processing differences between music and speech (Zatorre & Baum, 2012; Zatorre, Belin, & Penhune, 2002), there are also many similarities. For example, Nicholson, Baum, Kilgour, Koh, Munhall, and Cuddy (2003) studied an amusic patient who, following a right hemisphere stroke, was unable to detect differences in musical pitch and rhythm or to recognize different melodies. After numerous tests, they found that the patient's pitch and rhythm recognition in speech was similarly affected: he was unable to process intonation or to discriminate a question from a statement. …
