summary: New research shows that newborns can detect complex speech patterns that follow nonadjacent, language-like rules, suggesting that the ability to process such sequences is innate. Researchers used near-infrared spectroscopy to observe newborn brain responses to a series of sounds and found that infants were able to distinguish between correct and incorrect patterns.
The study found that this early ability engages language-related networks, particularly in the left hemisphere, laying the groundwork for future language skills. By six months, these networks become more specialized, showing that early exposure to sound shapes brain development.
This finding points to the importance of early auditory experiences and opens up the possibility of music interventions for young children to support language growth. These findings are particularly relevant for babies in less stimulating environments.
Key facts
- Newborns can detect nonadjacent sound patterns, a skill essential for language.
- The language processing area of the brain is activated from birth by a series of sounds.
- Early exposure to sounds may help develop language-related brain networks.
source: University of Vienna
A team of researchers, including psycholinguist Jutta Müller from the University of Vienna, has discovered that newborn babies can learn complex sequences of sounds that follow language-like rules.
After years of research suggesting as much, this groundbreaking study provides evidence that humans have an innate ability to recognize dependencies between nonadjacent acoustic signals.
The results were recently published in the journal PLOS Biology.
It has long been known that babies can learn sequences of syllables and sounds in direct succession. However, human languages often contain patterns that connect non-adjacent elements.
For example, in the sentence “The tall woman hiding behind the tree calls herself Catwoman,” the subject “the tall woman” is linked to the nonadjacent third-person singular verb ending “-s” in “calls.”
Research on language development shows that children begin to acquire such rules in their native language by the age of two. However, learning experiments have shown that infants as young as 5 months old can detect rules between nonadjacent elements not only in language but also in nonverbal sounds such as tones.
“Even chimpanzees, our closest relatives, can detect complex acoustic patterns embedded in sounds,” says co-author Simon Townsend of the University of Zurich.
Sound pattern recognition is innate
Many previous studies have suggested that the ability to recognize patterns between nonadjacent sounds is innate, but until now there was no clear evidence.
An international team of researchers showed evidence of this by observing brain activity in newborns and six-month-old infants as they listened to complex sound sequences. In their experiments, newborns a few days old were exposed to sequences in which the first sound was linked to a non-adjacent third sound.
After listening to two different types of sequences for just six minutes, the babies were presented with new sequences that followed the same pattern but had a different pitch. These new sequences were either correct or contained an error in the pattern.
Using near-infrared spectroscopy to measure brain activity, the researchers found that the newborn brains were able to distinguish between correct and incorrect sequences.
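The design described above pairs the first element of each sequence with a nonadjacent third element, then tests rule-following items against violations. The following is a minimal illustrative sketch of such a nonadjacent-dependency grammar; the actual study used tone sequences, and the tokens, pairings, and helper functions here are hypothetical, chosen only to show the structure of the paradigm.

```python
import random

# Hypothetical NAD grammar: the first element predicts the third,
# while the middle element varies freely.
PAIRS = {"A1": "B1", "A2": "B2"}      # first element -> required third element
MIDDLES = ["X1", "X2", "X3", "X4"]    # interchangeable middle elements

def make_sequence(first, middle):
    """Build one three-element familiarization sequence obeying the rule."""
    return (first, middle, PAIRS[first])

def make_test_item(first, middle, correct=True):
    """Correct items keep the paired ending; violations swap in the wrong one."""
    if correct:
        third = PAIRS[first]
    else:
        third = next(b for b in PAIRS.values() if b != PAIRS[first])
    return (first, middle, third)

# Familiarization stream: correct sequences with varied middle elements
stream = [make_sequence(a, random.choice(MIDDLES))
          for a in ("A1", "A2") for _ in range(3)]

# Test phase: one rule-following item and one violation
print(make_test_item("A1", "X2", correct=True))   # ('A1', 'X2', 'B1')
print(make_test_item("A1", "X2", correct=False))  # ('A1', 'X2', 'B2')
```

Distinguishing the two test items requires tracking the dependency across the intervening element, which is the ability the newborn brain responses indicated.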
Sound activates language-related networks in the brain
“The frontal cortex (the area of the brain located just behind the forehead) played an important role in newborns,” explains Yasuyo Minagawa of Keio University in Tokyo.
The strength of the frontal cortex’s response to incorrect sound sequences was associated with activation of a predominantly left hemisphere network that is also essential for language processing.
Interestingly, 6-month-old infants showed activation of this same language-related network when distinguishing between correct and incorrect sequences.
The researchers concluded that complex speech patterns activate these language-related networks from the beginning of life. Over the first six months, these networks become more stable and specialized.
Early learning experiences are key
“Our results show that the brain is capable of responding from day one to complex patterns such as those found in language,” explains Jutta Müller from the Department of Linguistics at the University of Vienna.
“The way brain regions are connected during neonatal learning suggests that early learning experiences may be important for forming the networks that later support the processing of complex acoustic patterns.”
These insights are key to understanding the role of environmental stimuli in early brain development. This is especially important when stimulation is lacking, inadequate, or poorly processed, such as in premature infants.
The researchers also highlighted that their findings demonstrate how nonverbal acoustic signals, such as the tone sequences used in the study, can activate language-related brain networks.
This opens up exciting possibilities for early intervention programs, for example using musical stimulation to promote language development.
About this neurodevelopmental and auditory neuroscience research news
author: Alexandra Frey
source: University of Vienna
contact: Alexandra Frei – University of Vienna
image: Image credited to Neuroscience News
Original research: Open access.
“Functional reorganization of brain regions supporting artificial grammar learning over the first six months of life” by Simon Townsend et al. PLOS Biology
abstract
Functional reorganization of brain regions supporting artificial grammar learning over the first six months of life
Pre-babbling infants can track nonadjacent dependencies (NADs) in the auditory domain. Although this forms an important prerequisite for language acquisition, the neurodevelopmental origins of this ability remain unclear.
We applied functional near-infrared spectroscopy in newborns and 6- to 7-month-old infants to investigate the neural substrates supporting NAD learning and detection, using sound sequences in an artificial grammar learning paradigm.
NAD detection was indicated by activation in the left prefrontal cortex in newborns, and in the left supramarginal gyrus (SMG), superior temporal gyrus (STG), and inferior frontal gyrus in 6- to 7-month-old infants.
Functional connectivity analysis further showed that neonatal activation patterns during the testing phase benefited from a brain network consisting of the prefrontal cortex, left SMG, and STG during the resting and learning phases.
These findings suggest that learning-related functional brain networks in the left hemisphere emerge at birth and may form the basis for the later involvement of these regions in NAD detection, providing a neural foundation for language acquisition.