We are constantly taking in information from the environment through our
senses. It is something we cannot help but do, and we each use this sensory
information to construct our own version of reality. But what is
reality?
None of us really has any idea what reality is actually
like: all we have is a limited sensory system, which interprets visual,
auditory and tactile information and relays it to our conscious
awareness. Yet we can only interpret a small part of reality, being
unable to detect, for example, radiation or broad sections of the light
spectrum.
This is one reason why it is folly to accept without question
the world your senses present to you. But there is another reason, one
that you have more direct control over: the sensitisation of your
reticular system and what it means for how you experience life on a
daily basis.
The general rule of your reticular system is that
whatever dominates your thoughts - both conscious and unconscious - will
also dominate your attention, whether you like it or not. Ever had a
toothache and then noticed that there seem to be an awful lot of adverts
on TV about toothpaste and dentists? This is your reticular system at
work. When a mother has a baby, she becomes acutely aware, even in
sleep, of every noise her baby makes. This, too, is her reticular system at
work, tuning attention to what is dominating her thought processes.
Now
let's consider what happens when the functioning of the reticular
system is not as it should be. Many children suffer from sensory
oversensitivity, whether visual, auditory or tactile, or all
three. This might present itself as a general oversensitivity in the
affected modality, or as a more specific oversensitivity to particular
sights, sounds and/or sensations. This is
again the work of the reticular system, in conjunction with the
thalamus. Because of a dysfunction within the brain, whether caused by
genetics or by brain injury, the child's reticular system becomes
sensitised to particular stimuli, whether visual, auditory or tactile, and
works in conjunction with the thalamus to excite the cortex so that the
stimulus is processed. However, because of the dysfunctional reticular
system, the cortex becomes over-excited and the child, not understanding
why the stimulus is triggering this reaction in his system, reacts
wildly. Here we have the basis for the sensory oversensitivity seen in many types
of developmental disability, including cerebral palsy, autism, Asperger's syndrome and other forms of brain injury.
Fortunately,
these neurological structures can be re-tuned, as they constantly are
in the uninjured human being, where awareness and attention are continually
redirected to salient features of the environment. Snowdrop
has developed techniques to help children who suffer from this type of
difficulty to re-tune the dysfunctional reticular formation, thus
allowing the opportunity for normal developmental processes to resume.
If you would like more information about Snowdrop's treatment programmes for brain injury, visit http://www.snowdrop.cc
Saturday, 21 July 2012
Wednesday, 11 July 2012
Study Shows the Deaf Brain Processes Touch Differently
This study again highlights the brain's adaptability. It demonstrates not only the 'rewiring' phenomenon we see in our children as a result of their participation in the Snowdrop programme, but also the fact that areas of the brain previously thought to be specialised for specific functions can adapt and take on other functions.
http://neurosciencenews.com/study-shows-the-deaf-brain-processes-touch-differently/?utm_source=feedburner&utm_medium=feed&utm_campaign=Feed%3A+neuroscience-rss-feeds-neuroscience-news+%28Neuroscience+News+Updates%29
Friday, 29 June 2012
The Brains of Children with Autism are Wired Differently.
Research into how the brain is connected differently in children with autism. What this study doesn't tell you is that 'wiring patterns' in the brain can be changed. The brain responds mainly to two things: genetic instruction (faulty genetic instruction can cause a faulty wiring pattern) and the stimuli it receives from the environment. The environment is by far the more powerful force, and the stimulation from it can be manipulated so as to encourage the brain to change. This is what the Snowdrop programme is all about.
-------------------------------------------------------------------------------
A research team led by Elizabeth Aylward, a University of Washington professor of radiology, reports that the brains of adults with autism are “wired” differently from those of people without the disorder. The researchers, who are affiliated with the University of Washington’s Autism Center, also found that this abnormal connection pattern may be the cause of the social impairments characteristic of autism.
The research team used functional magnetic resonance imaging in the study, which also revealed that the subjects with the most severe social impairment showed the most abnormal patterns of connectivity in the brain regions that process faces. One of the earliest characteristics to emerge in autistic children is a deficit in face processing, and this study is the first to examine how the brain processes information about faces.
Lead author Natalia Kleinhans states that "This study shows that these brain regions are failing to work together efficiently" and that the “work seems to indicate that the brain pathways of people with autism are not completely disconnected, but they are not as strong as in people without autism."
The study’s participants were 19 high-functioning autistic adults aged 18 to 44 with IQs of at least 85, and 21 age- and intelligence-matched typically developing adults. Within the autism spectrum disorder group were 8 individuals diagnosed with autism, 9 diagnosed with Asperger's syndrome, and 2 with pervasive developmental disorder not otherwise specified. Levels of social impairment were drawn from clinical observations and diagnoses.
Participants were shown 4 series of 12 pictures of faces and a similar series of pictures of houses, all while having their brains scanned. The pictures were viewed for 3 seconds, and occasionally they were repeated. The participants were instructed to press a button when a picture was repeated.
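To make the task concrete, here is a small, purely hypothetical sketch (in Python) of how such a 'one-back' repetition-detection sequence might be generated. The picture counts and the 3-second display time come from the description above, but the file names, the number of repeats and the code itself are illustrative assumptions, not details from the study.

import random

def build_block(stimuli, n_repeats=3, seed=None):
    """Return a presentation order in which a few pictures repeat immediately.
    Participants press a button whenever the current picture is identical
    to the one shown just before it."""
    rng = random.Random(seed)
    order = stimuli[:]
    rng.shuffle(order)
    for _ in range(n_repeats):                 # insert occasional immediate repeats
        pos = rng.randrange(1, len(order))
        order.insert(pos, order[pos - 1])
    return order

faces = ["face_%02d.png" % i for i in range(1, 13)]   # 12 face pictures per series (names invented)
block = build_block(faces, n_repeats=3, seed=1)

previous = None
for picture in block:
    press_button = picture == previous                 # the participant's task
    print("%-14s shown for 3 s   press=%s" % (picture, press_button))
    previous = picture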
Because this was a basic task, the two groups showed no difference in performance, but, according to co-author Todd Richards, “Differences might have shown up if they had been asked to do something more complicated."
While there was no difference in performance, the two groups exhibited different patterns of brain activity. The typically developing adults showed significantly more connectivity between the area of the brain involved in face identification and two other areas of the brain than did the autism group.
Those autistic participants with the largest social impairment demonstrated the lowest level of connectivity between the areas of the brain, leading the authors to conclude that "This study shows that the brains of people with autism are not working as cohesively as those of people without autism when they are looking at faces and processing information about them."
-------------------------------------------------------------------------------
Does this research mean that children with autism are 'stuck' with this connectivity problem? This is not what I am finding. We know that the brain has qualities of plasticity, that it is capable of reorganising its structure and functioning through environmental stimulation. We know that this plasticity is achieved through 'sprouting', that is, the forming of new synaptic connections through dendritic growth in response to this environmental stimulation. As I said at the beginning of this post, this means that the faulty wiring pattern which the brains of children with autism adopt can be changed. The question is, how do we do this? At Snowdrop, I do this by providing the child with an enriched developmental environment which delivers stimulation appropriate to the child's sensory and cognitive needs. In the particular instance of poor face-recognition processing, we can utilise specialised techniques to enhance children's abilities to process information concerning faces. Very often this leads to greater eye contact, better facial regard and the development of mutual attention. As these abilities underpin both language and social development, we can also see improvements in those areas.
Friday, 8 June 2012
Music and Language are Processed By Some of the Same Brain Systems
This is further justification for the use of music as a treatment tool within the Snowdrop programme, both generally and through such tools as 'The Listening Programme', of which Snowdrop is a provider.
With thanks to MNT
-----------------------------------------------------------------
Researchers have long debated whether or not language and music depend on common processes in the mind. Now, researchers at Georgetown University Medical Center have found evidence that the processing of music and language do indeed depend on some of the same brain systems.
Their findings, which are currently available on-line and will be published later this year in the journal NeuroImage, are the first to suggest that two different aspects of both music and language depend on the same two memory systems in the brain. One brain system, based in the temporal lobes, helps humans memorize information in both language and music -- for example, words and meanings in language and familiar melodies in music. The other system, based in the frontal lobes, helps us unconsciously learn and use the rules that underlie both language and music, such as the rules of syntax in sentences, and the rules of harmony in music.
"Up until now, researchers had found that the processing of rules relies on an overlapping set of frontal lobe structures in music and language. However, in addition to rules, both language and music crucially require the memorization of arbitrary information such as words and melodies," says the study's principal investigator, Michael Ullman, Ph.D., professor of neuroscience, psychology, neurology and linguistics.
"This study not only confirms that one set of brain structures underlies rules in both language and music, but also suggests, for the first time, that a different brain system underlies memorized information in both domains," Ullman says. "So language and music both depend on two different brain systems, each for the same type of thing -- rules in one case, and arbitrary information in the other."
Robbin Miranda, Ph.D., currently a post-doctoral researcher in the Department of Neuroscience, carried out this research with Ullman for her graduate dissertation at Georgetown. They enrolled 64 adults and used a technique called event-related potentials (ERPs), in which the brain's electrical activity is measured using electrodes placed on the scalp.
The subjects listened to 180 snippets of melodies. Half of the melodies were segments from tunes that most participants would know, such as "Three Blind Mice" and "Twinkle, Twinkle Little Star." The other half included novel tunes composed by Miranda. Three versions of each well-known and novel melody were created: melodies containing an in-key deviant note (which could only be detected if the melody was familiar, and therefore memorized); melodies that contained an out-of-key deviant note (which violated rules of harmony); and the original (control) melodies.
For listeners familiar with a melody, an in-key deviant note violated the listener's memory of the melody -- the song sounded musically "correct" and didn't violate any rules of music, but it was different than what the listener had previously memorized. In contrast, in-key "deviant" notes in novel melodies did not violate memory (or rules) because the listeners did not know the tune.
Out-of-key deviant notes constituted violations of musical rules in both well-known and novel melodies. Additionally, out-of-key deviant notes violated memory in well-known melodies.
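The logic of that design can be summarised in a few lines. The following sketch is purely illustrative (the condition labels follow the description above; the code itself is an assumption, not part of the study): it enumerates the six conditions and marks which ones should violate a memorised melody and which should violate the rules of harmony.

from itertools import product

familiarity = ["well-known", "novel"]
note_types = ["in-key deviant", "out-of-key deviant", "original (control)"]

def expected_violations(fam, note):
    # Memory can only be violated if the melody was familiar and a note was changed.
    violates_memory = (fam == "well-known") and (note != "original (control)")
    # Rules of harmony are violated by out-of-key notes, in familiar and novel melodies alike.
    violates_rules = (note == "out-of-key deviant")
    return violates_memory, violates_rules

print("%-40s %-10s %-10s" % ("condition", "memory?", "rules?"))
for fam, note in product(familiarity, note_types):
    mem, rules = expected_violations(fam, note)
    print("%-40s %-10s %-10s" % (fam + " / " + note, mem, rules))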
Miranda and Ullman examined the brain waves of the participants who listened to melodies in the different conditions, and found that violations of rules and memory in music corresponded to the two patterns of brain waves seen in previous studies of rule and memory violations in language. That is, in-key violations of familiar (but not novel) melodies led to a brain-wave pattern similar to one called an "N400" that has previously been found with violations of words (such as, "I'll have my coffee with milk and concrete"). Out-of-key violations of both familiar and novel melodies led to a brain-wave pattern over frontal lobe electrodes similar to patterns previously found for violations of rules in both language and music. Finally, out-of-key violations of familiar melodies also led to an N400-like pattern of brain activity, as expected because these are violations of memory as well as rules.
"This tells us that these two aspects of music, that is rules and memorized melodies, depend on two different brain systems -- brain systems that also underlie rules and memorized information in language," Ullman says. "The findings open up exciting new ways of thinking about and investigating the relationship between language and music, two fundamental human capacities."
Friday, 27 April 2012
The Brain Prefers Small 'Chunks' of Information.
Ever wondered why Snowdrop programme activities are presented as small, 'bite-sized' chunks? Or why our Vygotskian workshop activities are broken down into 'bite-sized' task series of simple units of activity?
------------------------------------------------------
In order to comprehend the continuous stream of stimuli that battle for our attention, humans seem to break down activities into smaller, more digestible chunks, a phenomenon that psychologists describe as "event structure perception."
Event structure perception was originally believed to be confined to our visual system, but new research published in Psychological Science, a journal of the Association for Psychological Science, reports that a similar process occurs in all sensory systems.
Researchers at Washington University examined event structure perception by studying subjects going about everyday activities while undergoing tests to measure neural activity. The subjects were then invited back a few days later to perform the same tasks, this time without their neural activity being measured. Instead, they were asked to divide the task where they believed one segment of an activity ended and another segment began.
The researchers surmised that if changes in neural activity occurred at the same points at which the subjects divided the activities, then it would be safe to suggest that humans are physiologically disposed to break down activities into bite-sized chunks (remember that the same subjects had no idea during the first part of the experiment that they would later be asked to segment the activity).
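As a rough illustration of that reasoning, the sketch below (hypothetical code, with invented example times and an invented two-second tolerance window) simply asks how many of the subjects' reported boundaries fall close to a point where neural activity changed.

def boundaries_align(neural_changes, reported_boundaries, tolerance=2.0):
    """Count how many reported boundaries fall within `tolerance` seconds
    of some change in neural activity."""
    hits = 0
    for boundary in reported_boundaries:
        if any(abs(boundary - change) <= tolerance for change in neural_changes):
            hits += 1
    return hits, len(reported_boundaries)

neural_changes = [12.5, 31.0, 58.2, 90.4]        # seconds into the activity (invented values)
reported_boundaries = [13.0, 30.0, 60.0, 89.5]   # where subjects later divided the task (invented)

hits, total = boundaries_align(neural_changes, reported_boundaries)
print("%d of %d reported boundaries coincide with a change in neural activity" % (hits, total))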
As expected, activity in certain areas of the brain increased at the points that subjects had identified as the beginning or end of a segment, otherwise known as an "event boundary." Consistent with previous research, such boundaries tended to occur during transitions in the task, such as changes of location or a shift in the character's goals. Researchers had previously hypothesised that people break down activities into smaller chunks when they are involved in an activity. However, this is the first study to demonstrate that this process occurs naturally and without awareness, and to identify some of the brain regions involved in it.
These results are particularly important to our understanding of how humans comprehend everyday activity. The researchers suggest that the findings provide evidence not only that people are able to identify the structure of activities, but also that this process of segmenting the activity into discrete events occurs without us being aware of it.
In addition, a subset of the network of brain regions that also responds to event boundaries while subjects view movies of everyday events was activated. It is believed that "this similarity between processing of actual and observed activities may be more than mere coincidence, and may reflect the existence of a general network for understanding event structure."
------------------------------------------------------
Tuesday, 17 April 2012
The Importance of Zinc.
All parents with children on the Snowdrop programme are made aware of the importance of Zinc.
-------------------------------------------
To the multitude of substances that regulate neuronal signaling in the brain and spinal cord add a new key player: zinc. By engineering a mouse with a mutation affecting a neuronal zinc target, researchers have demonstrated a central role for zinc in modulating signaling among the neurons. Significantly, they found the mutant mouse shows the same exaggerated response to noise as children with the genetic disorder called "startle disease," or hyperekplexia.
The findings shed light on a nagging mystery in neurobiology: why the connections among certain types of neurons contain considerable pools of free zinc ions. And even though many studies had shown that zinc can act toxically on the transmission of neural impulses, after half a century of experiments researchers had not been able to show conclusively that the metal plays a role in normal nerve cell transmission.
However, in an article in the journal Neuron, published by Cell Press, Heinrich Betz and colleagues conclusively demonstrate just such a role for zinc.
In their experiments, the researchers produced mice harboring a mutant form of a gene for a receptor for zinc in neurons--thereby compromising the neurons' ability to respond to zinc. The mutation in the receptor, called the glycine receptor, targets the same receptor known to be mutated in humans with hyperekplexia. The receptor functions as a modulator of neurons in both motor and sensory signaling pathways in the brain and spinal cord.
The genetic approach used by the researchers was a more targeted technique than previous experiments in which researchers reduced overall neuronal zinc levels using chemicals called chelators that soak up zinc ions.
The resulting mutant mice showed tremors, delayed ability to right themselves when turned over, abnormal gait, altered transmission of visual signals, and an enhanced startle response to sudden noise.
Electrophysiological studies of the mutant animals' brain and spinal neurons showed significant zinc-related abnormalities in transmission of signals at the connections, called synapses, among neurons.
Betz and his colleagues wrote that "The data presented in our paper disclose a pivotal role of ambient synaptic [zinc ion] for glycinergic neurotransmission in the context of normal animal behavior." They also concluded that their results implied that manipulating synaptic zinc levels could affect the neuronal action of zinc, but that such manipulation "highlights the complexity of potential therapeutic interventions," which could cause an imbalance between the excitatory and inhibitory circuitry in the central nervous system.
In a preview of the paper in the same issue of Neuron, Alan R. Kay, Jacques Neyton, and Pierre Paoletti wrote "Undoubtedly this work is important, since it directly demonstrates that zinc acts as an endogenous modulator of synaptic transmission." They wrote that the findings "will certainly revive the flagging hopes of zincologists. This work provides a clear demonstration that interfering with zinc modulation of a synaptic pathway leads to a significant alteration in the phenotype of the animal." The three scientists added that the finding "puts a nice dent in the zinc armor, which held firm for more than 50 years."
###
Hirzel et al.: "Hyperekplexia Phenotype of Glycine Receptor α1 Subunit Mutant Mice Identifies Zn2+ as an Essential Endogenous Modulator of Glycinergic Neurotransmission." Publishing in Neuron 52, 679-690, November 22, 2006. DOI 10.1016/j.neuron.2006.09.035
-------------------------------------------
Monday, 16 April 2012
Exposure to speech sounds is the basis of speech production.
This is why, in the Snowdrop programme, there are activities designed to give children the maximum exposure to speech sounds. As I say to every family, there is a link in the brain between exposure to language and language production.
-------------------------------------
Experience, as the old saying goes, is the best teacher. And experience seems to play an important early role in how infants learn to understand and produce language.
Using new technology that measures the magnetic field generated by the activation of neurons in the brain, researchers tracked what appears to be a link between the listening and speaking areas of the brain in newborn, 6-month-old and one-year-old infants, before infants can speak.
The study, which appears in this month's issue of the journal NeuroReport, shows that Broca's area, located in the front of the left hemisphere of the brain, is gradually activated during an infant's first year of life, according to Toshiaki Imada, lead author of the paper and a research professor at the University of Washington's Institute for Learning and Brain Sciences.
Broca's area has long been identified as the seat of speech production and, more recently, of social cognition, and is critical to language and reading, according to Patricia Kuhl, co-author of the study and co-director of the UW's Institute for Learning and Brain Sciences.
"Magnetoencephalography is perfectly non-invasive and measures the magnetic field generated by neurons in the brain responding to sensory information that then 'leaks' through the skull," said Imada, one of the world's experts in the uses of magnetoencephalography to study the brain.
Kuhl said there is a long history of a link in the adult brain between the areas responsible for understanding and those responsible for speaking language. The link allows children to mimic the speech patterns they hear when they are very young. That's why people from Brooklyn speak "Brooklynese," she said.
"We think the connection between perception and production of speech gets formed by experience, and we are trying to determine when and how babies do it," said Kuhl, who also is a professor of speech and hearing sciences.
The study involved 43 infants in Finland: 18 newborns, 17 six-month-olds and 8 one-year-olds. Special hardware and software developed for this study allowed the infants' brain activity to be monitored even if they moved, and captured brain activation with millisecond precision.
The babies were exposed to three kinds of sounds through earphones: pure tones that do not resemble speech, such as notes played on a piano; a three-tone harmonic chord that resembles speech; and two Finnish syllables, "pa" and "ta." The researchers collected magnetic data only from the left hemisphere of the brain among the newborns because they cannot sit up and the magnetoencephalography cap was too big to securely fit their heads.
At all three ages the infants showed activation in the auditory areas of the temporal lobe, the part of the brain responsible for listening to and understanding speech, showing they were able to detect sound changes for all three stimuli. But the pure perception of sound did not activate the areas of the brain responsible for speaking. However, researchers began seeing some activation in Broca's area when the 6-month-old infants heard the syllables or harmonic chords. By the time the infants were one year old, the speech stimuli activated Broca's area simultaneously with the auditory areas, indicating "cross-talk" between the area of the brain that hears language and the area that produces language, according to Kuhl.
"We think that early in development babies need to play with sounds, just as they play with their hands. And that helps them map relationships between sounds with the movements of their mouth and tongue," she said. "To master a skill, babies have to play and practice just as they later will in learning how to throw a baseball or ride a bike. Babies form brain connections by listening to themselves and linking what they hear to what they did to cause the sounds. Eventually they will use this skill to mimic speakers in their environments."
This playing with language starts, Kuhl said, when babies begin cooing around 12 weeks of age and begin babbling around seven months of age.
"They are cooing and babbling before they know how to link their mouth and tongue movements. This brain connection between perception and production requires experience," she said.
-------------------------------------