
University of Canterbury explores virtual reality in early childhood teacher training

by Freya Lucas

June 24, 2024

A new University of Canterbury (UC) study is developing critical training opportunities for early childhood education student teachers who are unable to work with infants aged birth to six months during their training.


While many prospective educators are passionate about working effectively with infants (0-6 months), not having ready access to real-life infants as part of their qualification can be problematic, which led the UC team to create specialised training environments through virtual reality (VR).


Professor Jayne White initiated the study, reaching out to HIT Lab NZ to gauge their interest in a collaboration after identifying an issue for student teachers who did not have ready access to real-life infants as part of their qualification.


The proposition quickly caught the attention of Associate Professor Heide Lukosch, Head of HIT Lab NZ’s Applied Immersive Game Initiative (AIGI), which aims to accelerate research and public use of immersive gaming applications with a view to improving personal, social, educational and health-related outcomes.


Professor White then led the two faculties, with funding from the University’s Child Well-being Research Institute, in the first trial of a VR prototype.


“VR has proven itself to be an effective strategy for learning practical skills across many disciplines, but its possibilities within education are yet to be fully explored,” Professor White said. “We’re excited by the potential of the tool, and the opportunities we’ve identified for future development.”


“Professor White proposed the idea of applying VR in early childhood education, because while infants are capable of expressing themselves from birth, it is sometimes difficult for non-familial adults to understand their verbal and non-verbal cues,” Associate Professor Lukosch added.


“What we can do through our research is provide the support required to understand the cues or signals they produce, to better identify and rehearse a grammar that will respond effectively to their needs.”


“The development of a VR tool of this kind would be enormously beneficial when you consider the wide-ranging situations in everyday life in which more than verbal communication is needed.” 


Associate Professor Lukosch and Professor White now co-lead the study which, informed by the Mātauranga Māori concept of whanaungatanga under the guidance of UC Senior Lecturer Dr Ngaroma Williams, will identify the key elements needed to support relational skills for adults working with infants, and translate them into novel training environments.


“We’re really intrigued by the opportunity and the power of virtual environments to help people in situations that would be otherwise hard to access, or potentially dangerous,” Associate Professor Lukosch explained.


“In this case, we aim to create a virtual, immersive environment that enables the user to really feel present and accountable for their actions in certain situations, with opportunities to reflect on these.”


A pair of gloves incorporating haptic technology will form an invaluable element of these virtual environments and encounters with infants. A mechanical system within the gloves will simulate the resistance someone would realistically feel as they carefully handled an infant, whether to change, soothe, feed, or entertain them.


One of the early VR training prototypes tasked users with establishing a positive interaction, and with interpreting and responding to non-verbal cues about the virtual infant's wants and preferences.


Preferences or personas can be programmed into a virtual baby avatar. As the user decides what actions to take in their interaction with the virtual baby, the research will enable the team to develop an intelligent system that decides how the ‘baby’ will respond.


“For example, we could programme a preference for red and when the user picks up a red toy, then the baby would smile. If they choose a yellow toy or similar, then it would start crying,” Associate Professor Lukosch explained.
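The colour-preference example above amounts to a simple rule-based persona: a programmed preference plus a conditional response. A minimal sketch of that idea might look like the following; the class and method names are illustrative assumptions, not the study's actual implementation.

```python
# Hypothetical sketch of the rule-based persona logic described above.
# Names (VirtualInfant, react_to_toy) are assumptions for illustration.

class VirtualInfant:
    def __init__(self, preferred_colour: str):
        # A simple programmed "persona": one preferred toy colour.
        self.preferred_colour = preferred_colour

    def react_to_toy(self, toy_colour: str) -> str:
        # Respond positively to a preferred toy, negatively otherwise,
        # mirroring the red/yellow example in the quote above.
        if toy_colour == self.preferred_colour:
            return "smile"
        return "cry"

baby = VirtualInfant(preferred_colour="red")
print(baby.react_to_toy("red"))     # smile
print(baby.react_to_toy("yellow"))  # cry
```

Keeping the responses as explicit hand-authored rules, rather than a learned model, is what gives the researchers the control over the system that their user studies require.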


The system does not yet use artificial intelligence, as the research component requires the team to retain control over the system for user studies, but there may be opportunities to incorporate this emerging technology in future.


“In these training scenarios, it’s very obvious that there’s a chance to integrate artificial intelligence later, to make the learning environments particularly adaptive.”


The UC team is also working closely with Professor Tony Walls and Dr Niki Newman, lead of the Christchurch-based University of Otago Simulation Centre, who are informing the healthcare elements of the study.


As the study progresses towards commercialisation over the next three years, the team hopes the general design principles for interaction derived from the study can be transferred to validated training environments in other domains.
