Is reading facial expressions more of a challenge for the “touch screen” generation?

by Freya Lucas

May 14, 2020

A new UCLA psychology study has explored a question which many early childhood education and care (ECEC) professionals may have pondered – are today’s children, who grow up with mobile technology from birth, worse at reading emotions and picking up cues from people’s faces than children who didn’t grow up with tablets and smartphones? 

Infancy and early childhood are critical phases in a child’s life, during which children learn to interpret important non-verbal cues such as facial expressions, tone of voice and gestures. Researchers wanted to learn more about the impact of “the ubiquitous use of tablets and other devices” on toddlers and their caregivers, and whether this was affecting younger children’s ability to understand these cues.

To find out more about the impact of these devices, the researchers worked with two cohorts of sixth graders from a Southern California public school, first in 2012 and again in 2017, testing each group’s ability to correctly identify emotions in photographs and videos.

Most children in the 2012 sixth-grade class were born in 2001, before the first iPhone was released in 2007 and the first iPad in 2010; the sixth graders in the 2017 class, by contrast, were infants and toddlers when those devices arrived.

The children in the 2017 cohort scored 40 per cent higher than those in the 2012 cohort at correctly identifying emotions in photographs, making significantly fewer errors.

In addition, the 2017 students were slightly better at identifying the emotions in a series of videos, but the researchers said that difference was not statistically significant. The psychologists did not look at face-to-face communication.

Lead author, adjunct assistant professor Yalda T. Uhls, said she hopes the findings will give ECEC professionals and parents “some peace of mind” that children are able to read social cues in photos, particularly given the prevalence of screen-based communication as a function of modern life.

A study involving 500 participants conducted in 2017 found that nearly half of the children involved regularly used a social media app or website between the ages of six and 12, with 29 per cent of those aged six to eight reporting that they used Snapchat.

“Perhaps our 2017 participants had more opportunities to see, communicate and learn nonverbal emotion expressed in photographs of faces than those from 2012 because of the time spent taking and reviewing photos of themselves and others,” Ms Uhls speculated.

These findings, however, do not discount the value of face-to-face communication, with the adjunct assistant professor strongly recommending that families have face-to-face conversations around the dinner table and at other times of the day. She also encouraged parents and caregivers to put their devices away when talking with other people, especially their children.

While the 2017 participants improved in their ability to read emotional cues in photos, Ms Uhls said she is not certain whether this ability transfers to assessing emotions in person; however, she believes it may.

“With so many children on screens so frequently,” she added, “it is important to know that good things can come from their interactions with photos.”

“I would expect with the recent increase in video communication, they may be now learning these cues from video chat too.”

To build on these findings, she said, researchers could evaluate the nuances of screen time to learn which practices have educational value and which do not.

“Technology is always evolving, and I expect that researchers will seek to understand how increased exposure to pictures, videos, live chats, games, virtual reality and other emerging platforms for communication impact our youth,” she continued.

The study is published in the journal Cyberpsychology, Behavior, and Social Networking.
