Born musical: new research shows that musicality is present from birth
Neuroscientists from York University have established that the capacity to hear the difference between minor and major notes in music is not taught, but instead is something 30 per cent of babies are born with.
Minor and major notes are significant building blocks in music, as they are considered to be what gives music emotion. Traditionally, music in a major key has been judged as ‘happy’ while minor key music is heard as ‘sad’; however, there are exceptions to this rule. For example, in 2013, Rolling Stone magazine polled readers, asking them to choose the saddest songs of all time.
Of the top 10 saddest songs chosen, seven are in major keys, dispelling the myth that songs in minor are “sad” and songs in major are “happy”. While emotions are subjective and dependent on a number of contexts, a wide range of resources supports the major/minor/emotion connection, making the ability to distinguish between minor and major one aspect of reading a composer’s intention.
The research is interesting because it has implications for how babies develop an appreciation of the emotional content of music.
Having previously established that approximately 30 per cent of adults can discriminate this difference but 70 per cent cannot, irrespective of musical training, the researchers wanted to learn more about the capacity of infants to distinguish between the notes.
Using a novel method based on eye movements and a visual stimulus, researchers found that six-month-old infants show exactly the same breakdown as adults: approximately 30 per cent of them could discriminate the difference and 70 per cent could not.
Trials were conducted with 30 six-month-old infants in which they heard a tone-scramble, a series of notes whose quality (major vs. minor) signalled the location (right vs. left) where a subsequent picture (target) would appear. The babies had to learn which side to look toward when they heard a major or a minor tone-scramble.
Once they heard a series of notes, a picture would appear on either the right or the left, depending on whether the tone-scramble was major or minor. In a second experiment, tone-scrambles did not reliably predict the location of subsequent pictures.
“What we measured over time was how the infants learned the association between which tone they heard and where the picture is going to show up. If they can tell the difference in the tone, over time, when they hear the major notes for example, they’ll make an eye movement to the location for the picture even before the picture appears because they can predict this. This is what we are measuring,” Associate Professor Scott Adler said.
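The logic of the measure described above can be sketched in code: score how often an infant’s anticipatory look lands on the side predicted by the tone-scramble’s quality. This is an illustrative sketch only; the trial data format, the major-right/minor-left mapping, and the accuracy threshold are assumptions, not details from the study.

```python
# Hypothetical sketch of scoring anticipatory eye movements in the
# tone-scramble task. The data format and threshold are illustrative
# assumptions, not taken from the published study.

from dataclasses import dataclass


@dataclass
class Trial:
    scramble: str    # quality of the tone-scramble: "major" or "minor"
    first_look: str  # side the infant looked toward before the picture appeared


# Assumed training mapping: major scrambles predict a picture on the
# right, minor scrambles a picture on the left.
TARGET_SIDE = {"major": "right", "minor": "left"}


def anticipation_accuracy(trials):
    """Fraction of trials where the anticipatory look matched the side
    predicted by the tone-scramble's quality."""
    correct = sum(1 for t in trials if t.first_look == TARGET_SIDE[t.scramble])
    return correct / len(trials)


def discriminates(trials, threshold=0.7):
    """Crude illustration: above-chance anticipation suggests the infant
    can tell major from minor (the 0.7 threshold is arbitrary)."""
    return anticipation_accuracy(trials) >= threshold
```

For example, an infant who looks to the correct side on three of four trials scores an accuracy of 0.75 and would be counted as discriminating under this (arbitrary) threshold.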
The results may also have implications for language development, which relies on some of the same mechanisms and auditory content as music, Associate Professor Adler continued.
“There is a connection between music, music processing and mathematical abilities, as well as language, so whether these things connect up to those abilities is unknown. However, when people talk to babies they change the intonation of their voice and the pitch of their voice so they’re changing from major to minor. That is actually an important component for babies to learn language. If you don’t have the capacity it might affect that ability in learning language.”
The study, published in the Journal of the Acoustical Society of America, may be viewed here.