Using artificial intelligence – a call for caution
The Office of the Australian Information Commissioner (OAIC) recently released new guidance to help organisations comply with privacy obligations under the Privacy Act 1988 (Privacy Act) when using artificial intelligence (AI).
The guidance highlights several ways organisations, including early childhood education and care (ECEC) providers, could unintentionally breach the Australian Privacy Principles (APPs) when using AI.
Some of the key risks for providers to consider include:
- Failure to be open and transparent
Failing to understand the privacy risks associated with AI can lead to a breach of APP 1, which requires personal information to be managed in an open and transparent way.
ECEC providers should consider whether their privacy policies need to be updated, or if other steps are necessary to promote transparency in light of any new uses of personal information.
If an individual finds it difficult to understand how an ECEC provider uses AI products, and where the data those products use will be transferred, the provider could be at risk of breaching the APPs. Providers should ensure individuals are informed about how their personal information is used and where it may be shared.
An ECEC provider should also consider conducting a privacy impact assessment before using AI or deploying an AI product to identify and assess potential risks and develop an action plan to address these risks.
- Collecting personal information from sources other than the individual
AI models often generate or infer personal information without collecting it directly from the individual concerned.
Under APP 3, personal information should be collected directly from the individual unless certain narrow exceptions apply (for example, where it is unreasonable or impracticable to do so).
Any use of AI products to collect personal information about a child or a child’s family should be thoroughly vetted from both a legal and risk perspective before deployment to avoid breaching this APP.
- Using personal information for a secondary purpose
Under APP 6, data collected for one purpose should not be used for another unless the individual has consented to the use, or another exception under the APP applies.
A breach of this APP can occur, for example, where an ECEC provider engages a third-party AI supplier who then uses the personal information the provider supplies to train its AI products. ECEC providers should take care to ensure that the terms of any engagement with a third-party AI supplier prevent this secondary use of personal information.
- Not protecting the integrity of personal information
Personal information can quickly become outdated. AI models may retain outdated personal information, making it challenging to ensure data accuracy, as required by APP 10.
Once data has been fed into an AI model, it could be challenging (if not impossible) for ECEC providers to end the use of that data by the model and remove those data records.
- Sending data overseas
Overseas processing of personal information is regulated under the APPs (in particular APP 8) and under the Privacy Act generally.
ECEC providers should therefore take care when engaging third-party AI suppliers and ensure that every overseas transfer of data is assessed for its lawfulness and appropriateness.
Practical steps for ECEC providers
Before using children’s personal information to train AI models, an early learning centre should ensure that all applicable privacy laws have been – and can continue to be – complied with.
ECEC providers should not only consider internal practices, but also their dealings with third-party contractors. It is essential to review standard contract terms to ensure they adequately address privacy risks associated with AI use.
A clear AI policy should also be in place and staff should receive training on how to comply with the policy and the law.
Understanding and managing these risks will help ECEC providers maintain trust with families and meet their legal obligations.
Jean Lukin is a Senior Associate at national law firm Holding Redlich, specialising in Information and Communication Technology contracting and practising broadly across areas of law pertinent to government organisations.
Nastassya Naude is an Associate at national law firm Holding Redlich, experienced in advising enterprise clients and startups from a broad range of sectors on complex Information and Communication Technology matters.