Ever wondered why the Snowdrop programme activities are presented as small 'bite-sized chunks'? And why our Vygotskian workshop activities are broken down into 'bite-sized' series of simple units of activity?
------------------------------------------------------
In order to comprehend the continuous stream of stimulation that battles for our attention, humans seem to break down activities into smaller, more digestible chunks, a phenomenon that psychologists describe as "event structure perception."
Event structure perception was originally believed to be confined to our visual system, but new research published in Psychological Science, a journal of the Association for Psychological Science, reports that a similar process occurs in all sensory systems.
Researchers at Washington University examined event structure perception by studying subjects going about everyday activities while undergoing tests to measure neural activity. The subjects were then invited back a few days later to perform the same tasks, this time without their neural activity being measured. Instead, they were asked to divide the task where they believed one segment of an activity ended and another segment began.
The researchers surmised that if changes in neural activity occurred at the same points at which the subjects divided the activities, then it would be safe to suggest that humans are physiologically disposed to break down activities into bite-sized chunks (remember that the same subjects had no idea during the first part of the experiment that they would later be asked to segment the activity).
As expected, activity in certain areas of the brain increased at the points that subjects had identified as the beginning or end of a segment, otherwise known as an "event boundary." Consistent with previous research, such boundaries tended to occur during transitions in the task, such as changes of location or a shift in the character's goals. Researchers had previously hypothesised that people break down activities into smaller chunks while engaged in them. However, this is the first study to demonstrate that this process occurs naturally, without awareness, and to identify some of the brain regions involved in it.
These results are particularly important to our understanding of how humans comprehend everyday activity. The researchers suggest that the findings provide evidence not only that people are able to identify the structure of activities, but also that this process of segmenting the activity into discrete events occurs without us being aware of it.
In addition, the activated regions included a subset of the network of brain regions that responds to event boundaries while subjects view movies of everyday events. It is believed that "this similarity between processing of actual and observed activities may be more than mere coincidence, and may reflect the existence of a general network for understanding event structure."
Friday 27 April 2012
Tuesday 17 April 2012
The Importance of Zinc.
All parents with children on the Snowdrop programme are made aware of the importance of Zinc.
-------------------------------------------
To the multitude of substances that regulate neuronal signaling in the brain and spinal cord add a new key player: zinc. By engineering a mouse with a mutation affecting a neuronal zinc target, researchers have demonstrated a central role for zinc in modulating signaling among the neurons. Significantly, they found the mutant mouse shows the same exaggerated response to noise as children with the genetic disorder called "startle disease," or hyperekplexia.
The findings shed light on a nagging mystery in neurobiology: why the connections among certain types of neurons contain considerable pools of free zinc ions. And even though many studies had shown that zinc can act toxically on the transmission of neural impulses, half a century of experiments had not enabled researchers to show conclusively that the metal plays a role in normal nerve cell transmission.
However, in an article in the journal Neuron, published by Cell Press, Heinrich Betz and colleagues conclusively demonstrate just such a role for zinc.
In their experiments, the researchers produced mice harboring a mutant form of a gene for a receptor for zinc in neurons--thereby compromising the neurons' ability to respond to zinc. The mutation affects the glycine receptor, the same receptor known to be mutated in humans with hyperekplexia. The receptor functions as a modulator of neurons in both motor and sensory signaling pathways in the brain and spinal cord.
The genetic approach used by the researchers was a more targeted technique than previous experiments in which researchers reduced overall neuronal zinc levels using chemicals called chelators that soak up zinc ions.
The resulting mutant mice showed tremors, delayed ability to right themselves when turned over, abnormal gait, altered transmission of visual signals, and an enhanced startle response to sudden noise.
Electrophysiological studies of the mutant animals' brain and spinal neurons showed significant zinc-related abnormalities in transmission of signals at the connections, called synapses, among neurons.
Betz and his colleagues wrote that "The data presented in our paper disclose a pivotal role of ambient synaptic [zinc ion] for glycinergic neurotransmission in the context of normal animal behavior." They also concluded that their results implied that manipulating synaptic zinc levels could affect the neuronal action of zinc, but that such manipulation "highlights the complexity of potential therapeutic interventions," which could cause an imbalance between the excitatory and inhibitory circuitry in the central nervous system.
In a preview of the paper in the same issue of Neuron, Alan R. Kay, Jacques Neyton, and Pierre Paoletti wrote "Undoubtedly this work is important, since it directly demonstrates that zinc acts as an endogenous modulator of synaptic transmission." They wrote that the findings "will certainly revive the flagging hopes of zincologists. This work provides a clear demonstration that interfering with zinc modulation of a synaptic pathway leads to a significant alteration in the phenotype of the animal." The three scientists added that the finding "puts a nice dent in the zinc armor, which held firm for more than 50 years."
Hirzel et al.: "Hyperekplexia Phenotype of Glycine Receptor α1 Subunit Mutant Mice Identifies Zn2+ as an Essential Endogenous Modulator of Glycinergic Neurotransmission." Published in Neuron 52, 679-690, November 22, 2006. DOI 10.1016/j.neuron.2006.09.035
Monday 16 April 2012
Exposure to speech sounds is the basis of speech production.
This is why, in the Snowdrop programme, there are activities designed to give children the maximum exposure to speech sounds. As I say to every family, there is a link in the brain between exposure to language and language production.
-------------------------------------
Experience, as the old saying goes, is the best teacher. And experience seems to play an important early role in how infants learn to understand and produce language.
Using new technology that measures the magnetic field generated by the activation of neurons in the brain, researchers tracked what appears to be a link between the listening and speaking areas of the brain in newborn, 6-month-old and one-year-old infants, before infants can speak.
The study, which appears in this month's issue of the journal NeuroReport, shows that Broca's area, located in the front of the left hemisphere of the brain, is gradually activated during an infant's initial year of life, according to Toshiaki Imada, lead author of the paper and a research professor at the University of Washington's Institute for Learning and Brain Sciences.
Broca's area has long been identified as the seat of speech production and, more recently, of social cognition, and is critical to language and reading, according to Patricia Kuhl, co-author of the study and co-director of the UW's Institute for Learning and Brain Sciences.
"Magnetoencephalography is perfectly non-invasive and measures the magnetic field generated by neurons in the brain responding to sensory information that then 'leaks' through the skull," said Imada, one of the world's experts in the uses of magnetoencephalography to study the brain.
Kuhl said there is a long history of a link in the adult brain between the areas responsible for understanding and those responsible for speaking language. The link allows children to mimic the speech patterns they hear when they are very young. That's why people from Brooklyn speak "Brooklynese," she said.
"We think the connection between perception and production of speech gets formed by experience, and we are trying to determine when and how babies do it," said Kuhl, who also is a professor of speech and hearing sciences.
The study involved 43 infants in Finland - 18 newborns, 17 6-month-olds and 8 one-year-olds. Special hardware and software developed for this study allowed the infants' brain activity to be monitored even if they moved, and captured brain activation with millisecond precision.
The babies were exposed to three kinds of sounds through earphones: pure tones that do not resemble speech, such as notes played on a piano; a three-tone harmonic chord that resembles speech; and two Finnish syllables, "pa" and "ta." The researchers collected magnetic data only from the left hemisphere of the brain among the newborns, because they cannot sit up and the magnetoencephalography cap was too big to fit their heads securely.
At all three ages the infants showed activation in the auditory areas in the temporal part of the brain, which are responsible for listening and understanding speech, showing they were able to detect sound changes for all three stimuli. But the pure perception of sound did not activate the areas of the brain responsible for speaking. However, researchers began seeing some activation in Broca's area when the 6-month-old infants heard the syllables or harmonic chords. By the time the infants were one year old, the speech stimuli activated Broca's area simultaneously with the auditory areas, indicating "cross-talk" between the area of the brain that hears language and the area that produces language, according to Kuhl.
"We think that early in development babies need to play with sounds, just as they play with their hands. And that helps them map relationships between sounds with the movements of their mouth and tongue," she said. "To master a skill, babies have to play and practice just as they later will in learning how to throw a baseball or ride a bike. Babies form brain connections by listening to themselves and linking what they hear to what they did to cause the sounds. Eventually they will use this skill to mimic speakers in their environments."
This playing with language starts, Kuhl said, when babies begin cooing around 12 weeks of age and begin babbling around seven months of age.
"They are cooing and babbling before they know how to link their mouth and tongue movements. This brain connection between perception and production requires experience," she said.
Saturday 14 April 2012
Music and Cognition.
I was asked by a parent only this week why music is so important to the Snowdrop programme. Many children with developmental disabilities like cerebral palsy and autism also experience learning difficulties; the power of music to influence cognition makes it an important facet of any programme of rehabilitation.
A recent volume of the Annals of the New York Academy of Sciences takes a closer look at how music evolved and how we respond to it. Contributors to the volume believe that animals such as birds, dolphins and whales make sounds analogous to music out of a desire to imitate each other. This ability to learn and imitate sounds is a trait necessary to acquire language and scientists feel that many of the sounds animals make may be precursors to human music.
Another study in the volume looks at whether music training can make individuals smarter. Scientists found more grey matter in the auditory cortex of the right hemisphere in musicians compared to nonmusicians. They feel these differences are probably not genetic, but instead due to use and practice.
Listening to classical music, particularly Mozart, has recently been thought to enhance performance on cognitive tests. Contributors to this volume take a closer look at this assertion and their findings indicate that listening to any music that is personally enjoyable has positive effects on cognition. In addition, the use of music to enhance memory is explored and research suggests that musical recitation enhances the coding of information by activating neural networks in a more united and thus more optimal fashion.
Other studies in this volume look at music's positive effects on health and immunity, how music is processed in the brain, the interplay between language and music, and the relationship between our emotions and music.
The Neurosciences and Music II is volume 1060 of the Annals of the New York Academy of Sciences.
Thursday 12 April 2012
Long-term Changes In Experience Cause Neurons To Sprout New Long-lasting Connections
This study not only demonstrates plasticity in action, it also highlights why the repetition of activities within the Snowdrop programme of rehabilitation for children with developmental disabilities has to be based upon the long term, in order to encourage dendritic spines to persist and form new circuits.
With thanks to MNT
-------------------------
The researchers said their findings aid understanding of how procedural learning induces long-term rewiring of the brain. This type of learning is used in mastering skills such as riding a bicycle or typing on a computer.
Howard Hughes Medical Institute (HHMI) investigator Karel Svoboda and his colleagues reported their findings in the June 22, 2006, issue of the journal Nature. Other co-authors of the paper included Anthony Holtmaat and Linda Wilbrecht in Svoboda's laboratory at Cold Spring Harbor Laboratory; and Graham Knott and Egbert Welker at the University of Lausanne in Switzerland.
Svoboda is one of a handful of researchers in the world who are pioneering the development of new tools and techniques that permit scientists to observe the brain as it rewires over a period of weeks or months. This summer Svoboda will move to HHMI's Janelia Farm Research Campus where he will pursue neurobiology studies and projects in optics and microscopy.
In the studies reported in Nature, the researchers used mice that were genetically altered to produce a green fluorescent protein in specific neurons in the neocortex, which is a region of the brain that is known to adapt to new experiences. The researchers followed the growth of dendritic spines in the region of the neocortex that processes tactile information from the animals' whiskers. Sensory information from the whiskers is vitally important for mice as they navigate their environment. Consequently, a significant portion of the mouse's brain is devoted to processing input from whiskers.
To monitor changes in neuronal structure visually, the researchers used a novel technique that Svoboda's team had developed earlier. The scientists employed laser-scanning microscopy aimed through a small glass window in the animals' skulls to image changes in fluorescence that revealed dendritic alterations present on the neurons.
In this most recent experiment, Svoboda's team was looking to see if persistent changes in connectivity occurred after a long-term change in experience. They focused on the formation of new long-lasting spines that indicate robust connections between a neuron's highly branched dendrites and nearby axons. Dendrites are the input side of neurons and axons the output side. Spines stipple the surface of dendrites, like twigs from a branch, and form the receiving ends of synapses, which are the junctions between neurons where neurotransmitters are released.
The researchers induced a long-term change in the sensory experience of mice by trimming the animals' whiskers, selectively cutting some but leaving neighboring whiskers, in a chessboard pattern. Over time, this selective trimming caused the animals' neurons to rewire themselves to adapt to loss of the whiskers - a strategy that reduces dependence on lost whiskers and enhances input from intact whiskers.
"We knew from previous work that a subpopulation of dendritic spines appears and disappears over time, but we didn't know the biological meaning of this process," said Svoboda. "In these experiments, we found that over about a month new spines appeared, and in particular a subset of these new spines stabilized to form new circuits," he said.
He said that the researchers were most excited to find that after this long-term change in experience there were many new spines that were large, robust and "persistent," meaning that they remained over long periods of time. "Under normal conditions, most new spines are faint and small, and disappear after a couple of days," said Svoboda. "What we think they are doing is reaching out and probing for potential neuronal partners. And while most of them retract, with novel sensory experience some populations of these new spines make connections that constitute beneficial new wiring. So those spines gain bulk and are stabilized as persistent spines."
In addition to observing the gain and loss of persistent spines, the researchers also used electron microscopy to examine the new spines in more detail. That study showed that the new persistent spines participate in synapses, said Svoboda.
Thursday 5 April 2012
Confirmation of 'written word recognition area' in the brain.
This is one of the reasons why the Snowdrop reading programme focuses not only on building phonological awareness, auditory attention skills, grapheme-to-phoneme correspondence, etc., but also provides stimuli in the form of 'whole word flashcards.'
With thanks to Medical News Today.
-------------------------------------------------
Humans have an uncanny ability to skim through text, instantly recognizing words by their shape--even though writing developed only about 6000 years ago--long after humans evolved. Thus, neuroscientists have hotly debated whether an area of the cortex called the Visual Word Form Area (VWFA) is truly a specific and necessary area for recognizing words.
Functional MRI scans have shown that the area specifically activates when people read, as opposed to recognizing other objects, such as faces or houses. And people with lesions in the region lose the ability to recognize whole words--reduced to letter-by-letter reading. However, fMRI studies cannot demonstrate a causal role for the VWFA, and lesions involving the VWFA invariably involved other regions as well.
Now, a patient whose surgery to relieve epilepsy specifically disrupted the VWFA has given researchers, led by Laurent Cohen of the Hopital de la Salpetriere, an opportunity to demonstrate that the region does indeed play a causal role in the ability to recognize words.
The researchers reported in the April 2006 issue of Neuron the results of reading, language, and object recognition tests both before and after the surgery on the 46-year-old man. They found his reading capability before surgery to be normal. However, tests after surgery showed very different results.
"Although we studied reading more extensively than the perception of other types of visual stimuli, our patient presented a clear-cut reading impairment following surgery, while his performance remained flawless in object recognition and naming, face processing, and general language abilities," reported the researchers. "Such selectivity may be difficult to observe in patients with more customary lesions resulting from strokes or tumors, which often affect a larger extent of cortex and white matter. The small size of the present lesion thus provides precious support to the idea of partial regional selectivity for word perception in the ventral cortex," they wrote.
Importantly, the researchers observed that before the surgery, the patient could recognize long words as quickly as short ones; but after the surgery, the recognition time increased linearly as a function of word length. Such findings indicated that the patient had been reduced to recognizing words letter by letter.
"How could there be a piece of neural tissue dedicated to a recently invented cognitive skill like word recognition?" wondered Alex Martin in a preview of the paper in the same issue of Neuron. Nevertheless, Martin commented, Cohen and his colleagues "report a unique set of findings in favor of the existence of the VWFA that will surely add fuel to the debate." He concluded that "The single case study…provides compelling evidence that the VWFA plays a causal role in the chain of neural events that underlie normal reading."
Tuesday 3 April 2012
Ability for Grammar 'Hardwired' into Humans.
This is not too far from Chomsky's idea of a 'Language Acquisition Device' hardwired into the human brain, containing all of the roughly 250 speech sounds that occur across the world's languages. Exposure to a specific language then stimulates the retention of the speech sounds within that language, while the sounds the child is not exposed to are dropped. The fact that these Nicaraguan boys were not exposed to language and therefore did not develop any spoken language supports this view. The fact that their sign system contained common grammatical components also supports Chomsky's notion that there is a 'deep structure' common to all languages. Food for thought.
---------------------------------
"Our findings suggest that certain fundamental characteristics of human language systems appear in gestural communication, even when the user has never been exposed to linguistic input and has not descended from previous generations of skilled communicative partners," says Elissa L. Newport, George Eastman Professor of Brain and Cognitive Sciences and Linguistics at the University of Rochester. "We examined a particular hallmark of known grammatical systems and found that these signers also used this same hallmark in their gestured sentences. They designed their own language and wound up with some of the same rules of grammar every other language uses."
For eight years, Newport and Marie Coppola, a post-doctoral student at the University of Chicago, studied three deaf Nicaraguan boys who had no exposure to any formal sign language. They were linguistically separated from spoken language by virtue of their complete deafness since birth; separated from knowledge of Nicaraguan Sign Language because they had never had contact with another signer; and separated from written Spanish since they had little or no formal education. This isolation forced each of the three boys to develop his own gesture-based language, called a 'home sign system' in the field of sign language research. These three isolated languages gave Coppola and Newport a window into how the brain creates language.
The home signers watched 66 very short videos consisting of single actions, such as a woman walking or a man smelling flowers. Using their home sign, they explained what they had seen. All three home signers consistently used the grammatical construction of "subject" in the same form it is used throughout languages around the world.
The concept of "subject" is ubiquitous in language, but is complex and difficult to define. Language assigns concepts to symbols, but does so imperfectly--a noun is usually an object, but certainly not always, as the noun "liberty" demonstrates. A prominent example of this abstract property of language is the idea of subject. While grammar school teachers might explain that a subject is the person, place or thing that performs the action in the sentence, in fact subjects are not necessarily the one who produces or instigates an action.
For instance, in the sentence, "John opened the door," the subject is "John"; but in "The door opened," the word "door" has become the subject, and in "John got hit," the word "John" is the subject even though he is the recipient of the action. Despite having to design their own languages essentially without influence from any other speakers or signers of an established language, the home signers created a complex grammatical component and used it in the same way highly evolved languages do. That the idea of "subject" exists in these individuals and is used in the same manner strongly suggests that this basic and somewhat arbitrary property of language is an innate tendency in humans as they develop any communication system.
"The notion of 'subject' does not appear to require either linguistic input or a lengthy history within a language to develop," says Newport. "We're starting to see that the grammatical concept of 'subject' is part of the bedrock on which languages form."
Newport is continuing her research into other aspects of linguistics to see what else may be innate in human language, and also how language input alters and expands these innate tendencies.