New AI safety standards endorsed by ATSE: What does that mean for ECEC?

by Freya Lucas

October 01, 2024

The Australian Academy of Technological Sciences and Engineering (ATSE) has welcomed the release of the voluntary AI Safety Standards, which provide guidance to AI developers and users.


ATSE CEO Kylie Walker said Australia has the potential to lead the world in responsible AI. 


“This is Australia’s AI moment. Ultimately, these proposals will help Australia lead in both technological and regulatory innovation in AI, setting a global standard for responsible and effective AI development and deployment,” Ms Walker said.


The endorsement of the Safety Standards is an important signal to the early childhood education and care (ECEC) sector. Services should ask vendors who use AI in developing their products, or who sell AI-driven solutions, about their adherence to the standards, and about their awareness of the shifting safety expectations surrounding sensitive information about children and the use and storage of children's images.


ATSE supports the guardrails and standards, and the principles proposed by the government for determining whether an AI system is high-risk. The underpinning regulatory mechanism for determining which AI systems are deemed high-risk, and therefore subject to the proposed mandatory guardrails, is still under consultation.


This mechanism, the Academy warned, will need to be clear and accountable to ensure that high-risk systems “don’t slip through the gaps.”


“Greater adoption of AI could see Australia’s economy increase by $200 billion annually, but it is critical that robust measures are rapidly implemented to safeguard these areas and position Australia at the forefront of AI development,” Ms Walker continued.


“Investing further in local AI innovations will simultaneously create new AI industries and jobs here in Australia and reduce our reliance on internationally developed and maintained systems.”


By developing and refining AI solutions onshore, she continued, Australia would have greater ability to regulate AI development in line with Australian community values and expectations.


Learn more about the voluntary safety standards and how they apply to ECEC here.
