Intersensory processing and learning new words

Intersensory processing is an essential pathway to learning new words, researchers find

by Freya Lucas

June 27, 2023

Being able to match a face to a voice in early infancy is a strong predictor of later language development, researchers from Florida International University have found.

Matching the sight and sound of speech (or putting a face to a voice) is known as intersensory processing, and it is an essential pathway to learning new words. The degree to which a child can do this at six months of age is a powerful predictor of language and vocabulary outcomes at 18 months, two years and three years of age, the researchers said.

“Adults are highly skilled at this, but infants must learn to relate what they see with what they hear. It’s a tremendous job and they do it very early in their development,” said lead author Elizabeth V. Edgar. 

“Our findings show that intersensory processing has its own independent contribution to language, over and above other established predictors, including parent language input and socioeconomic status.”

Ms Edgar and her team tested intersensory processing speed and accuracy in 103 infants between the ages of three months and three years, using the Intersensory Processing Efficiency Protocol (IPEP).

Designed to introduce distraction and simulate the “noisiness” of picking out a speaker in a crowd, the IPEP presents several short video trials. Each trial depicts six faces of women displayed in separate boxes on the screen at once. All the women appear to be speaking.

However, the soundtrack that matches only one of the women speaking is heard on each trial. With an eye tracker that follows pupil movement, the researchers could measure whether the babies made the match, as well as how long they watched the matching face and voice.

The data were then compared with language outcomes at different stages of development, such as how many unique words a child had. Infants who looked longer at the correct speaker were later found to have better language outcomes at 18 months, two years and three years of age.

For parents and caregivers, Ms Edgar said, the research serves as a reminder that babies rely on coordinating what they see with what they hear in order to learn language.

“That means it is helpful to gesture toward what you’re talking about or move an object around while saying its name,” she said. 

“It’s the object-sound synchrony that helps show that this word belongs with this thing. As we’re seeing in our studies, this is very important in early development and lays the groundwork for more complex language skills later on.”

Access the findings in full here.
