AI is becoming inevitable. What does that mean for ECEC?

by Freya Lucas

November 07, 2023

The advent of artificial intelligence (AI) has been a game changer for many sectors and industries, including early childhood education and care (ECEC). 

In the ECEC space, products and services which use AI to write observations and reports, complete Certificate III and Diploma assignments, and respond to parent emails and enquiries are readily available.

In the piece below we explore some of the questions ECEC professionals have about AI, and how it is, or could be, utilised in ECEC.

What is AI? 

Coined as a term in the 1950s, artificial intelligence refers to the simulation of human intelligence by machines. It covers an ever-changing set of capabilities as new technologies are developed. Technologies that come under the umbrella of AI include machine learning and deep learning.

In practice, specific applications of AI include expert systems, natural language processing, speech recognition and machine vision.

AI requires a foundation of specialised hardware and software for writing and training machine learning algorithms. 

In general, AI systems work by ingesting large amounts of labelled training data, analysing the data for correlations and patterns, and using these patterns to make predictions about future states.

To use an ECEC example, a chatbot might be fed examples of parent enquiries, like emails, text messages and webforms, to ‘learn’ about what parents typically want or ask for when approaching a new centre. 
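
As a rough illustration of that ‘labelled examples in, predictions out’ process, the sketch below trains a tiny text classifier to sort enquiries into topics. It is not any vendor’s product: the enquiry texts, the topic labels and the library choice (scikit-learn) are assumptions made purely for illustration, and a real chatbot would be trained on far more data.

```python
# A minimal sketch of the "labelled data in, predictions out" idea described above.
# The enquiry texts and topic labels below are invented for illustration only.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Labelled training data: past parent enquiries and the topic each was about.
enquiries = [
    "Do you have any places available for a three year old next year?",
    "What are your daily fees, and is the Child Care Subsidy applied?",
    "What time does the centre open and close each day?",
    "Can you tell me about the waitlist for the toddler room?",
    "How much is a full day of care before subsidies?",
    "Are you open during the school holidays?",
]
topics = ["enrolment", "fees", "hours", "enrolment", "fees", "hours"]

# Learn word patterns that correlate with each topic, then predict for a new enquiry.
model = make_pipeline(TfidfVectorizer(), LogisticRegression(max_iter=1000))
model.fit(enquiries, topics)

print(model.predict(["Is there a spot for my two year old from January?"]))
```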

“Don’t think of your AI chatbot as an omniscient artificial brain, but as a gifted graduate student assigned to be your personal work assistant,” warn Professor Kai Riemer and Sydney Executive Plus Director Sandra Peter.

“Like an eager grad student, they work tirelessly and mostly competently on assigned tasks. However, they are also a little bit cocky. Always overconfident, they might take risky shortcuts and provide answers that sound good but lack any factual grounding.”

What are some of the possible benefits of AI for ECEC?

For many, the appeal of AI in an ECEC context lies in reducing administrative burdens and freeing up time by streamlining everyday tasks such as scheduling, data analysis and some aspects of communication with parents and families, like summative reports and learning stories.

At a broader, provider-level scale, AI can trawl through large sets of data and identify patterns in children’s behaviour, serious incidents, rostering bottlenecks, staffing and more, saving time and human resources as well as improving occupational health and safety outcomes.
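
As a minimal sketch of what that kind of pattern-finding could look like, the example below uses plain data analysis (rather than a full AI system) on a hypothetical incident log exported to CSV; the file name and the occurred_at and incident_type columns are assumptions for illustration only.

```python
# A minimal sketch of pattern-finding in service data.
# Assumes a hypothetical CSV export with 'occurred_at' and 'incident_type' columns;
# real providers would work with their own systems and far richer data sets.
import pandas as pd

incidents = pd.read_csv("incident_log.csv", parse_dates=["occurred_at"])

# Count incidents by day of week, hour and type to reveal clusters,
# for example a spike during outdoor play or at pick-up time.
incidents["day"] = incidents["occurred_at"].dt.day_name()
incidents["hour"] = incidents["occurred_at"].dt.hour

pattern = (
    incidents.groupby(["day", "hour", "incident_type"])
    .size()
    .sort_values(ascending=False)
    .head(10)
)
print(pattern)
```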

What are some of the possible concerns with AI? 

Many iterations of AI are still in their infancy, and in the ECEC sector there are concerns. Speaking with CELA, Dr Kate Highfield, a teacher and researcher known for her exploration of young children’s engagement with technology, play, and learning, said AI tools “do have to be used with caution, as the responses these tools generate are only as good as the information we input and then we also need to consider the data we are uploading, the privacy of this work of our children and of course the requests we are making.”

Dr Highfield also expressed her concern about AI tools compromising relationships. 

“Within all our work, relationships come first,” she said. 

“The risk here is that we allow the technology to take precedence over relationships with our children and their families. AI can’t yet build relationships, so it cannot fully support our work that is relational.” 

Who’s leading the AI charge? 

Some reports suggest that one in three Australian employees is already using AI, while more than 70 per cent of businesses are not yet on board. This points to a gap between what employees are doing and what employers have policies, procedures and practices for, which may present a challenge, particularly in a highly regulated sector such as ECEC where children’s safety is paramount.

If educators and other ECEC professionals are using AI tools without endorsement, guidance, policy or procedures, there could be ethical, reputational and security concerns for services and providers. 

What’s next? 

The possibilities of AI are endless, particularly in an educational setting. 

A UK school has appointed an AI chatbot as a “principal headteacher” to support its headmaster, advising staff on issues such as policy writing, how to help children with additional needs, and other aspects of leadership. 

AI toys and apps can already respond to the needs of children, helping them to gradually improve fine motor skills and cognitive ability.

Language apps help children to acquire language through interactive learning experiences designed to improve their speaking, listening, reading and writing.

Considering AI? 

For those services and providers who would like to integrate AI tools, Early Learning Management offers the following advice:

  • Training and Upskilling: Educators and administrators should be trained to understand and utilise AI technologies effectively.
  • Data Security: Given the sensitive nature of information in ECEC settings, robust cybersecurity measures are essential.
  • Parental Involvement: Parents should be kept in the loop about the use of AI in ECEC services, ensuring transparency and trust.
  • Government Guidelines: Regulatory bodies should provide frameworks for the ethical use of AI in ECEC.
  • Pilot Programs: Before full-scale implementation, pilot programs can help understand the practical challenges and benefits of using AI.

Professor Riemer and Ms Peter’s thoughts can be found here. Dr Highfield’s remarks were shared with CELA in a piece called AI and documentation: What are the ethical considerations?, and Early Learning Management published their perspective on AI here.

Early Childhood Australia has developed a statement on young children and digital technologies, which may support thinking about AI when considered in line with the Code of Ethics.
