Evidence 01

AI Training

Meta's global privacy director confirmed under oath to the Australian Senate that the company scrapes every public Facebook and Instagram post by Australian adults, dating back to 2007, to train its AI. Photos of children posted on adult accounts are included. Australians have no opt-out, unlike Europeans protected by the GDPR.

Read the full evidence →

Evidence 02

Facial Recognition

Clearview AI scraped over 30 billion photos from the open web, including Facebook, to build a facial recognition database. The Australian Information Commissioner found it breached the Privacy Act. Children's faces are included. Despite orders to delete Australian data, there is no evidence of compliance.

Read the full evidence →

Evidence 03

Deepfakes

AI-generated explicit images of at least 50 female students at a Melbourne school were created using their publicly accessible social media photos and circulated in June 2024. School photos at another Melbourne school were manipulated in 2025. The pipeline from school Facebook post to deepfake is disturbingly short.

Read the full evidence →

Evidence 04

Training Datasets

Human Rights Watch found 362 identifiable Australian children in the LAION-5B dataset after reviewing less than 0.0001% of it. Sources included school uploads. Children's full names, ages, and school locations were exposed. Once trained into AI models, the data cannot be removed.

Read the full evidence →

Evidence 05

No Consent

The NSW "Permission to Publish" form offers parents a binary choice: consent to all public publishing, or to none. There is no option to say "yes to newsletters, no to Facebook," and no mention of AI training, facial recognition, or deepfakes. The consent framework is from a different era.

Read the full evidence →

Evidence 06

The First Step

Moving to a private Group stops third-party scraping, but not Meta. Getting off Facebook stops Meta, but not all platforms. There is no single fix. But there is a clear first step, and an honest picture of what each level of action actually protects against.

Read the full evidence →

The System

Facebook in Australian Schools

The laws, the policies, the consent forms, and the process. The Privacy Act, the PPIP Act, the NSW DoE Social Media Policy, the consent framework, who takes the photos, who posts them. And the critical detail: NSW DoE policy requires school accounts to be public.

Read the full breakdown →

Risk summary

What a public school Facebook Page exposes children to, right now.

| Risk | Status | Evidence |
| --- | --- | --- |
| Meta AI training on children's photos | Confirmed | Admitted under oath, Australian Senate, Sept 2024 |
| Facial recognition scraping | Confirmed | 30B+ images scraped; OAIC found breach of Privacy Act |
| AI training dataset inclusion | Confirmed | 362 AU children found in <0.0001% sample (HRW) |
| Deepfake generation | Active threat | 50+ Melbourne schoolgirls targeted, June 2024 |
| Search engine indexing | Confirmed | Standard for all public Facebook Pages |
| Consent form inadequacy | Systemic | Binary choice; no AI disclosure; stale consent |
| Regulatory non-compliance risk | Growing | Children's Privacy Code due December 2026 |

Ready to take the first step?

We've prepared a motion template, a step-by-step guide, and an FAQ for your P&C meeting.

Take action