Moving to a private Group is the first step, not the last. Here is what each level of action actually protects against, and what it does not.
There is no simple fix. Photos already posted on public Facebook Pages have already been scraped. AI models trained on those photos cannot unlearn them. Facial recognition databases that captured those faces will hold them indefinitely.
What is done is done. But we can stop it getting worse, together. Each step we take reduces our children's exposure going forward.
The minimum first step. What we are asking schools to do right now.
| Threat | Protected? |
|---|---|
| Third-party scrapers (Clearview AI, LAION, etc.) | Yes. Content not publicly accessible. |
| Search engine indexing | Yes. Private Group content not indexed. |
| Random public access to children's photos | Yes. Membership approval required. |
| Meta's own AI training | Unclear. Meta says it does not currently use private Group content for AI training, but its terms grant a broad licence over all content on its platforms. Those terms can change at any time. |
| Members saving or resharing photos | No. Any member can still screenshot, save, and share. |
| Historical scraping (already happened) | No. Already in databases and models. |
A private Group immediately eliminates the broadest attack surface: unrestricted public access. It stops indiscriminate, automated, mass-scale scraping by third parties. Right now, anyone on the internet, with no connection to the school, can access and download every photo of every child. A private Group closes that door.
The next step. Removes Meta from the equation.
| Threat | Protected? |
|---|---|
| Third-party scrapers | Yes |
| Search engine indexing | Yes |
| Random public access | Yes |
| Meta's own AI training | Yes. No content on Meta's platform. |
| Other platform AI training | Depends on the alternative platform's terms and policies. |
| Historical scraping | No. Already in databases and models. |
Moving to a purpose-built school communication platform removes Meta entirely. No data on Meta's platform means no Meta AI training.
Purpose-built alternatives are already in use by Australian schools.
The right alternative depends on the school community. The key question: does the school control the platform, or does a company that mines data for AI?
The most protective approach. Not the first step, but where the conversation leads.
| Threat | Protected? |
|---|---|
| All scraping and AI training | Yes. No identifiable images online to scrape. |
| Facial recognition capture | Yes |
| Deepfake source material | Yes |
| Historical scraping | No. Already in databases and models. |
This is the most protective position. It means schools share photos of activities and events without identifiable faces, or share them only through non-digital channels (printed newsletters, physical displays in the school).
We are not advocating for this as a first step. Parents value seeing their children's school life. But every digital photo of a child, on any platform, carries some risk. The question is how much risk a school community is comfortable accepting.
This is the hardest part. Photos already posted on public Facebook Pages may already have been scraped by third-party data collectors, ingested into AI training datasets, and captured by facial recognition systems.
Deleting those posts will not remove data from systems that have already processed it. But deleting them still helps: future scraping runs cannot re-collect what is no longer there, and each deleted photo is one less piece of source material available going forward.
Cleaning up historical public posts is a necessary companion to any move forward.
The direction is clear: stronger protections for children's data, less tolerance for unconsented use of personal information, greater accountability for platforms that scrape and train on that data. Schools that act now are ahead of where the law is going. Schools that wait are accumulating risk.
We are not asking schools to stop communicating with us. We are not asking them to abandon social media overnight. We love seeing our kids at school. We are asking for one specific, practical, free change: move from a public Facebook Page to a private Facebook Group.
This is Level 1. It will not solve everything, and we owe each other honesty about that. But it immediately stops the worst of it: the indiscriminate public exposure of our children's photos to every scraper, every AI system, and every stranger on the internet.
Then we keep going. Together.