Manual gesture vs signed word: what the brain tells

Gesture in hearing literature tends to evoke an association with hands. My art works deconstruct this concept. So does the neuroscience.

First, it's important to understand the foundation: neuroscience research shows that signed and spoken languages equally activate the brain's linguistic regions (Broca's and Wernicke's areas). For those of you who don't know, ASL and English are distinct languages of their own, just as French, German, Chinese, etc. (spoken languages) and Auslan, French SL, Chinese SL, etc. (signed languages) are. ASL is not English on the hands.

Furthermore, "the scientists tested whether sign language taps into the same parts of the brain as charades. They wanted to figure out whether the brain regards sign language as more similar to spoken language, or more similar to making pantomime gestures to mimic an action." [Source]

"The scientists showed both deaf people and hearing people pictures of objects, such as a broom or a bottle of syrup, and asked the subjects to 'show how you would use this object.' Then they asked the deaf subjects to sign the verbs associated with particular objects, such as syrup or broom." [Source]

Result?

Not surprisingly (or surprisingly), "the researchers found that the signers activated different parts of their brains when pantomiming versus when signing." Even when the sign is iconic: "When similar hand gestures are used – the brain treats it like language."

I need to remind you that while many ASL signs (or words) are iconic, many spoken words are also iconic (or, to use a fancy word, 'onomatopoeia'): bang, beep, chirp, clap, cough, flush, growl, knock, pop, rip, screech, slap, tweet, zip, and endlessly more. Thus, I can say many ASL words are onomatopoeic. Both spoken and signed languages even have iconic words that cannot be readily understood by foreigners.

That is:

"And the scans showed that the brain areas signers used when pantomiming were similar to the brain areas hearing participants used when pantomiming – both groups activated the superior parietal cortex, which is associated with grasping, rather than brain areas connected to language." [Source]

Linguists and neuroscientists such as Dr. Petitto and Dr. Emmorey say that these studies suggest the linguistic regions of the brain are organized for language, not for speech.

Speakers can distinguish words from gestures in spoken form. They can distinguish linguistic intonation from vocal emotion. But hearing non-signing people are so new to signed language that they are sometimes unsure whether a movement of the hand is just a gesture or a word.

Scenario: At the end of my mid-semester masters-level review with the advisors, an advisor whispered a question to our English-ASL interpreter (well, she was supposed to ask me instead of the interpreter): "I noticed that she was twiddling with her ear. What does it mean in ASL?" The interpreter replied, "No, she was just fidgeting. Just like one fidgets with a cup, paper, or pen." The interpreter then immediately informed me (she did the right thing), and I chuckled. It was amusing because I thought it was very obvious.

In ASL classes, I occasionally scratched my face or the like, and one of my ASL students would look at me as if I were saying something in ASL. I waved no.

On the other hand, native ASL speakers, even native ASL-speaking babies, can easily distinguish one from the other.

In this video clip, the child Juli scratched her cheek, then spoke "itch" in ASL and resumed her scratching (action, not gesture). The gesture, the action, and the word "itch" are distinguishable.

Yes, Deaf signers do use (manual) gestures blended into signed words, just as hearing speakers use vocal gestures integrated with spoken words. What's more, hearing speakers also use manual gestures while speaking vocally. It would be no surprise, then, that Deaf people use vocal gestures while signing.

Just like the law of complementarity. Like the Möbius strip.

You might also be interested in Gesture: the audio version: a deconstruction art.