The term "gesture" in the hearing mindset and literature tends to evoke an association with the hands, possibly because the movements of manual articulators are external and visible.
Second, hearing people use spoken language in the vocal-aural modality in combination with manual gesture. This makes it appear that speech and (manual) gesture are separate, while signed language and manual gesture are on the same level.
For the sake of clarification and redefinition: gesture can take either a visual-spatial or an aural-vocal form. Vocal gesture exists, and it may be used in combination with sign language.
Here are a few quick facts before discussing gesture and word. First, neuroscience research shows that language activates the linguistic regions of the brain (Broca's and Wernicke's areas) in both signing and speaking.
Second, signed and spoken languages follow a similar timeline of language acquisition from birth. Forget the claims of 'baby sign language' (another post).
ASL is neither English on the hands nor gesture. ASL and English are distinct languages in their own right, just as French, German, Chinese, etc. are distinct spoken languages and Auslan, French SL, Chinese SL, etc. are distinct signed languages.
A sign (or a signed word) is a word. ASL is a language. Language is amodal. Signing = talking, speaking, etc. as in non-written form.
"The scientists tested whether sign language taps into the same parts of the brain as charades. They wanted to figure out whether the brain regards sign language as more similar to spoken language, or more similar to making pantomime gestures to mimic an action." [Source]
"The scientists showed both deaf people and hearing people pictures of objects, such as a broom or a bottle of syrup, and asked the subjects to 'show how you would use this object.' Then they asked the deaf subjects to sign the verbs associated with particular objects, such as syrup or broom."
"The researchers found that the signers activated different parts of their brains when pantomiming versus when signing." Even if a sign is iconic. "When similar hand gestures [signed words] are used – the brain treats it like language."
"And the scans showed that the brain areas signers used when pantomiming were similar to the brain areas hearing participants used when pantomiming – both groups activated the superior parietal cortex, which is associated with grasping, rather than brain areas connected to language." [Source]
While several ASL signs (or words) are iconic, many spoken words are iconic as well (the fancy word is 'onomatopoeia'): bang, beep, chirp, clap, cough, flush, growl, knock, pop, rip, screech, slap, tweet, zip, and countless more. Likewise, one could say several ASL words are onomatopoeic. Both spoken and signed languages have iconic words that cannot always be readily understood by foreigners.
An experiment tested whether hearing participants could understand unfamiliar, iconic ASL words (signs). First, asked to guess the meanings outright, they did not get a single answer right. Next, a group of participants was shown the signs along with multiple choices; they still got few answers right.
Just as mime and word are processed in different parts of the brain, gesture and word activate different parts of the brain. Let's look at some very powerful evidence from pronouns in sign language.
Both hearing and deaf babies similarly use prelinguistic communicative or meaningful gestures such as pointing and raising one's arms to be picked up. Gestural finger-pointing emerges at the stage of 9-10 months in both hearing and deaf infants, even if the hearing children are not exposed to sign language.
Linguistics studies show that English-speaking toddlers don't develop proper usage of pronouns until about 18 months, and they continue to make errors with personal pronouns, e.g. substituting "you" for "me" or vice versa, until about 24 months.
Now, the interesting thing is that gestural pointing (at 9 months) and pronouns in ASL share the same form. The pronouns are quite transparent or iconic, e.g. "you" by pointing to the addressee and "me" by pointing to oneself.
So the question is: would native ASL-signing children exposed to ASL from birth acquire pronouns earlier, and with fewer errors, than English-speaking children?
The answer in Petitto's study (1987) is (surprisingly or not surprisingly) no.
Even though the prelinguistic pointing gesture and the linguistic pointing pronoun share the same form, the development and acquisition of pronouns in both languages/modalities follow the same timeline!
In my own longitudinal study of my child, I witnessed this fascinating phenomenon myself.
Speakers can distinguish words from gestures in spoken form, and they can distinguish linguistic intonation from vocal emotion. But when hearing non-signers are new to signed language, they are sometimes unsure whether a movement of the hand is just a gesture or a word.
Scenario: at the end of my mid-semester master's-level review with the advisors, one advisor whispered to our English-ASL interpreter (well, she was supposed to ask me rather than the interpreter), "I noticed that she was twiddling with her ear. What does it mean in ASL?" The interpreter replied, "No, she was just fidgeting, just like one fidgets with a cup, paper, or pen." The interpreter then immediately informed me (she did the right thing). It was amusing, because I thought it was very obvious. Fidgeting is not even remotely a gesture!
In my ASL classes, I occasionally scratched my face or the like, and one of my ASL students would look at me as if I were saying something in ASL. Native ASL speakers, on the other hand, even native ASL-speaking youngsters, can easily and quickly tell one from the other.
This video clip -- a good real-life example -- shows the child Juli scratching her cheek (neither gesture nor word), then uttering "itch" in ASL, and resuming her scratching (an action, neither gesture nor word). The manual gesture, the action, and the (signed/spoken) word "itch" are distinguishable, at least for a fluent/native signer.
Deaf signers do use (manual) gestures blended into signed words, just as hearing speakers use vocal gestures integrated with spoken words. It would be no surprise that some Deaf people use vocal gesture while signing, just as hearing people use manual gesture while vocally speaking. Many Deaf people choose to turn off vocal gesture when manually speaking, just as hearing people restrain manual gesture when vocally speaking.
It's just like the law of complementarity, like a Möbius strip.
Linguists and neuroscientists such as Dr. Petitto and Dr. Emmorey say that these studies suggest the linguistic regions of the brain are organized for language, not for speech.
Gestures (vocal or manual) are non-linguistic expressions whereas words (in signed form) are linguistic.
Posted 2016. Updated 2021.
Karen Emmorey, Language, Cognition, and the Brain: Insights from Sign Language Research, pp. 183-184.