Every public Facebook post since 2007, scraped to train AI. Confirmed under oath to the Australian Senate. No opt-out for any of us.
On 11 September 2024, Meta's global privacy policy director Melinda Claybaugh appeared before the Australian Senate Select Committee on Adopting Artificial Intelligence. Under questioning from Greens Senator David Shoebridge, she made a series of admissions that affect every family in Australia.
Meta has been scraping all public Facebook and Instagram posts from Australian adults since 2007 to train its AI models, including Llama and Meta AI. That's every public photo, every public status update, every public comment. Nearly two decades of content.
"Meta has just decided that you will scrape all of the photos and all of the text from every public post on Instagram or Facebook that Australians have shared since 2007, unless there was a conscious decision to set them on private." Senator David Shoebridge, Australian Senate Select Committee, 11 September 2024
When Senator Shoebridge asked Claybaugh directly to confirm Meta had been scraping posts going back to 2007, she responded with a single word: "Correct."
Claybaugh confirmed that while data from accounts of users under 18 is excluded from AI training, photos of children posted on adult accounts are not. If an adult, or a school, posts a photo of a child on a public Facebook Page, that photo is scraped for AI training.
Labor Senator Tony Sheldon, the committee chair, pressed this point. Parents don't read privacy policies. They would have no idea that photos of their "13-year-old daughter" or "nine-year-old son" were being harvested to train AI systems. Claybaugh did not dispute it.
Our school Facebook Pages are organisational accounts run by adults. Every photo of our children posted on those public Pages is available to Meta for AI training. This is not theoretical. This is not a risk. This is happening right now, confirmed by Meta's own representative under oath. This was not disclosed when schools set up their Facebook Pages. It was not part of any consent process.
When asked why Australians can't opt out of having their data used for AI training, as Europeans can under GDPR, Claybaugh was blunt:
"The specific option that we're offering in Europe is in response to a very specific legal framework." Melinda Claybaugh, Meta Global Privacy Policy Director, Australian Senate, 11 September 2024
In other words: Meta will only offer privacy protections when the law forces it to. Australia's current privacy laws don't force it. So Australians, and Australian children, get no protection.
Senator Shoebridge summarised the position plainly:
"Meta made it clear today that if Australia had these same laws, Australians' data would also have been protected." Senator David Shoebridge, 11 September 2024
When pressed on why Meta scrapes public data at this scale, Claybaugh argued it prevents bias and ensures representative training data for AI development. Our children's faces are collateral damage in Meta's AI ambitions, and Meta considers that an acceptable trade-off. We don't.
When Meta says it uses public posts to "train AI," it means feeding the content, including images of children, into machine learning systems that learn to recognise, generate, and manipulate visual content. Once a photo has been used for training, its contribution to the model cannot be undone.
A school could delete every photo it has ever posted on Facebook tomorrow, and it would make no difference to the AI models already trained on those photos. The damage is done the moment the photo is scraped. The only way to prevent it is never to make the photo public in the first place.
Meta has been operating in Australia since Facebook launched locally in the mid-2000s. Facebook alone has roughly 19 million monthly active Australian users. Every public post from every one of those users, nearly two decades of content, has been fed into AI training systems.
There are 9,653 schools in Australia (ABS, 2024). The vast majority have a Facebook presence, and most of those are public Pages. Each school posts dozens to hundreds of photos of students per year. The aggregate volume of identifiable children's images on public school Facebook Pages is enormous, and every one of them is available to Meta's AI.
European users were given an opt-out mechanism for AI training because the General Data Protection Regulation (GDPR) required it. In September 2024, Claybaugh told the Senate that Meta had "paused launching our AI products in Europe while there is a lack of certainty." Meta eventually resumed training on EU user data in May 2025, but only after giving every European user a clear opt-out form and receiving regulatory clearance from the Irish Data Protection Commission.
Australians got none of that. No notification. No opt-out form. No regulatory clearance. Meta scrapes freely here because the law allows it.
The OAIC's Children's Online Privacy Code, due by 10 December 2026, may change this. The exposure draft was released on 31 March 2026 and is open for public consultation until 5 June 2026. But every day until the Code takes effect, public school Facebook Pages are feeding children's photos into AI systems with no legal barrier and no parental opt-out.
The September 2024 hearing triggered a chain of regulatory activity. None of it has yet stopped Meta from scraping Australian children's photos.
As of April 2026, Australians still have no opt-out. The second tranche of Privacy Act reforms, expected to introduce a "fair and reasonable" test for data use, has no confirmed introduction date.
A private Facebook Group takes content out of public view, removing it from the pool of public posts Meta scrapes. Meta says it does not currently use private messages or private group content for AI training. But Meta's terms of service grant it a broad licence over all content posted on its platforms, and those terms can change at any time. The only way to guarantee Meta cannot use children's photos for AI training is to stop posting children's photos on Meta's platforms entirely. A private Group is the minimum first step, not the solution.