
Can experience with co-speech gesture influence the prosody of a sign language? Sign language prosodic cues in bimodal bilinguals*

Diane Brentari, Marie A. Nadolske and George Wolford

Abstract

In this paper the prosodic structure of American Sign Language (ASL) narratives is analyzed in deaf native signers (L1-D), hearing native signers (L1-H), and highly proficient hearing second language signers (L2-H). The results of this study show that the prosodic patterns used by these groups are associated both with their ASL language experience (L1 or L2) and with their hearing status (deaf or hearing), suggesting that experience using co-speech gesture (i.e. gesturing while speaking) may have some effect on the prosodic cues used by hearing signers, similar to the effects of the prosodic structure of an L1 on an L2.

Corresponding author

Address for correspondence: Diane Brentari, Department of Linguistics, University of Chicago, 1010 East 59th Street, Chicago, IL 60637-1512, USA. Email: dbrentari@uchicago.edu

Footnotes

* This research was supported by NSF grant BCS 0547554 to Diane Brentari and NIH grant DC5241 to Ronnie B. Wilbur. We also thank two anonymous reviewers for helpful comments on earlier versions of this manuscript.
