Showing posts from September, 2018

Future research must develop models of language that explain the clustering of characteristics found in sign languages.

The discovery of these sign language properties brings to light observations about spoken language that seem to be determined by the modality alone, such as the tendency to string linguistic information out linearly, and the generally arbitrary nature of the lexicon and grammar. Before research on sign language, such characteristics were thought to be properties of language universally. In the light of sign language research, we may suggest that these properties are artefacts of the oral–aural modality. Our work is cut out for us. An important goal of future research must be to develop models of language that explain the clustering of characteristics found in sign languages, as well as the fact that they conform to more general linguistic universals. In developing such models, we are charged with an equally important task: explaining precisely what it is about the spoken language modality that makes it different from sign language. Only by examining these two natural ...

Neural Control of Sign Languages.

Neurolinguists attempt to gain insight into the nature of language by determining which areas of the brain control various components of the language system. One way in which this is done is by studying the performance of people who have incurred brain damage, such as the localized damage resulting from strokes. In hearing, speaking people, numerous studies have found that damage to certain areas of the left cerebral hemisphere typically results in some type of language dysfunction, called aphasia. There are different types of aphasia, in which damage to different areas of the brain results in different types of language deficits. It is assumed that if a given function y is lost after damage to a certain area A, then that area is involved in the control or processing of that function. In this realm of investigation as well, similarities between signed and spoken languages have been found. Although damage to areas of the left hemisphere results in aphasia in hearing subje...

Recent Challenges.

A context for the future investigation of the relationship between language and cognition is an existing fundamental dispute about the nature of the language faculty. At one extreme is the strong Chomskyan view that language is an "organ," innately specified, and both computationally and neurologically divorced from other aspects of cognition. Other scholars argue that the experience of the child acquiring language plays a stronger role in determining language form. Jackendoff (1997) develops a theory according to which language is the outcome of a combination of factors, some specific to language (and possibly genetically specified), and others that tap more general concepts and knowledge about the world. We now turn to certain outstanding questions about sign language that bear on this controversy. When they are seriously addressed, we believe the answers will lead us to a far deeper understanding of the language capacity than would have been possible without sign language re...

Basic, unexplained similarities among sign languages.

We have argued that sign languages bear important similarities to spoken languages. But we have only hinted at how similar the grammatical structures of sign languages are to one another. As Elissa Newport stressed in an address to sign language researchers, this important generalization needs to be explained. Using various grammatical criteria, linguists sometimes find it instructive to group languages into categories or types. These criteria may be applied at any level of analysis – syntax, morphology, or phonology. For example, some languages have the Swahili type of morphology; others have the Navaho type; etc. In syntax, some languages have Subject-Verb-Object word order; others have Verb-Subject-Object order, for example. Phonologically, some languages allow several consonants to occur together before a vowel appears; others allow only one consonant at the beginning of a syllable. The point is that spoken languages may fall into one of any of a number of categories at...

Neurological differences.

Aphasia studies show clearly that both spoken and signed languages are controlled in the left hemisphere. Some recent brain research on normal subjects using modern techniques such as positron emission tomography and functional magnetic resonance imaging is suggestive of possible differences in brain organization related to sign language. In particular, recent results find right hemisphere involvement in sign language processing by deaf and hearing native signers. One research group used regional cerebral blood flow and positron emission tomography to examine sign and spoken language representation in hearing native signers (hearing people born to deaf parents who learned sign language as a first language). The two techniques showed bilateral activation (activation in both cerebral hemispheres) for both modalities. Another research group, using event-related potentials, similarly found bilateral activation for both modalities, for both deaf and hearing native signers. Using func...