Meta Deliberately Collects Data From Millions of Kids Under Age 13 — But Doesn’t Want Public to Know

Facebook’s parent company, Meta, knowingly allows millions of children under age 13 to use Instagram, but “zealously” hides that fact from the public, according to a newly unredacted legal complaint filed against the company and reported by The New York Times.

Although Meta received more than 1 million reports of underage users, it disabled only “a fraction” of the accounts and instead “routinely continued to collect” children’s personal information — including locations and email addresses — without parental consent, the unredacted document alleges.

The company also purposefully uses its technology “to entice, engage, and ultimately ensnare youth and teens” and publicly misrepresents its platforms as safe for youth, while concealing its own internal research showing that users experienced harms on its platforms at high rates.

These practices violated the federal children’s privacy law and California’s false advertising and unfair competition laws, according to the complaint.

“Within the company, Meta’s actual knowledge that millions of Instagram users are under the age of 13 is an open secret that is routinely documented, rigorously analyzed and confirmed,” the complaint noted.

The charges are part of a federal lawsuit filed in October by a bipartisan coalition of 33 attorneys general against Meta Platforms Inc. in the U.S. District Court for the Northern District of California. The original filing included evidence that remained conditionally under seal as part of the investigation.

The mostly unredacted complaint, publicly released Monday, restores details of Meta’s alleged misconduct that were blacked out in the original filing.

Using text from internal employee emails and chats, phone call transcripts, company presentations and passages from internal reports, the complaint makes the case that Meta saw great value in children’s and teens’ time and data and intentionally targeted them to grow company profits.

“Meta knows that what it is doing is bad for kids — period. Thanks to our unredacted federal complaint, it is now there in black and white, and it is damning,” said California Attorney General Rob Bonta in a press release. “We will continue to vigorously prosecute this matter.”

Under the Children’s Online Privacy Protection Act (COPPA), it is illegal to collect personal data from children under 13 without parental consent, and companies are subject to fines of up to $50,120 per violation.

The lawsuit seeks injunctive relief along with civil penalties and other financial restitution. If the lawsuit succeeds, Meta could face hundreds of millions of dollars or more in penalties.

When the complaint was first filed in October, Meta put out a statement saying it was “disappointed” with the lawsuit. It added that the company shared the attorneys general’s commitment to providing teens with “safe, positive experiences online, and have already introduced over 30 tools to support teens and their families.”

On Saturday, a Meta spokesperson said, “The [unredacted] complaint mischaracterizes our work using selective quotes and cherry-picked documents,” the Times reported.

Child users on Meta platforms

In 2021 congressional testimony, Meta executives downplayed the company’s knowledge of under-13 users by citing its terms of service, which state that users under 13 are not permitted, and by noting that users are asked to self-report that they are 13 years of age or older.

The complaint, however, includes internal Meta documents containing detailed reporting on the social media giant’s “penetration into 11- and 12-year-old demographic cohorts,” reports to Zuckerberg that there were 4 million under-13 users on the sites and other evidence that Meta was well aware of the underage users.

The charts show the company knew there was “daily and continuously increasing use” of Instagram by under-13 users, and that its use of the marketing term “penetration” signals it both desires and intends such use.

Meta also was informed of individual underage Instagram accounts through complaints filed with the company, sometimes by parents. But the company has a policy of “automatically ignoring certain external reports” if the account does not contain a user bio or photo, the prosecutors allege, and it continues to collect those users’ data.

In 2021 alone, the company received “over 402,000 reports of under-13 users on Instagram via its underage reporting webform and in-app underage reporting process,” but it disabled fewer than 164,000 of those accounts.

Meta collects personal data on all Facebook and Instagram users. And for those users under age 13, it does so without parental consent, in violation of COPPA.

Under COPPA, online sites and services “directed to children under 13” must obtain parental consent before collecting or using personal information from a child.

Growing ‘time spent’ on the platform 

Former Meta Chief Operating Officer Sheryl Sandberg and co-founder Mark Zuckerberg denied publicly, in press statements and congressional hearings respectively, that Meta designed its platforms to be addictive in order to maximize “time spent.”

But the longer a user stays on a platform such as Facebook or Instagram, the more personal data a platform can collect, and the more effective targeted ads can be, according to the complaint.

This is dangerous, according to Dr. Victoria L. Dunckley, an integrative psychiatrist and author of “Reset Your Child’s Brain: A Four-Week Plan to End Meltdowns, Raise Grades, and Boost Social Skills by Reversing the Effects of Electronic Screen-Time.” She told The Defender that screen time, particularly on social media, harms children in a number of ways, and that children don’t necessarily have the capacity to resist it.

She said:

“All screen time stresses the nervous system and is ultimately depressogenic, by altering brain chemistry, reward pathways, the body clock and stress hormones. When you consider this, and add on the layers that social media imparts — social comparison, body image issues, self-destructive behaviors, compulsive use, and so on — and then you add on to that the fact that children’s brains are still developing, it is ludicrous to suggest that children should be using social media, much less using it without parent permission.

“The acuity level we’re seeing in young people today is sky high, like I’ve never seen in 20 years. And kids aren’t getting better, even those with resources. A major part of this is the irresistible pull of social media — even when they see it as a problem they cannot stop. We, the adults, have to help them stop. Make them stop.”

Internal communications snippets in the complaint show that Meta was aware of these harms, yet the company focused on “driving time spent” among kids and teens by developing tools, like its “Recommendation Algorithm,” that explicitly exploit such effects rather than avoid them.

For example, documents show Meta knew its Recommendation Algorithms trigger intermittent dopamine releases in young users, “whose developing brains are especially susceptible to such tactics.” And it knew that this could “contribute to problems” for young users.

Yet Meta not only continued to use them, it also used data harvested from users to target user engagement on an individual level via its Recommendation Algorithms, “making continued engagement even more difficult for young users to resist,” prosecutors allege.

Meta’s knowledge of how platforms affect children and teens was first made public in 2021 when whistleblower Frances Haugen shared internal documents with The Wall Street Journal showing the company knew its platforms worsened depression, eating disorders and suicidal thoughts among teenage girls.

In this case, the complaint includes sections of an internal document presented to Zuckerberg raising concerns that image filters that simulated the effects of plastic surgery were having harmful effects on the mental health of teenage girls.

Zuckerberg personally vetoed the proposed policy to ban such images, calling it “paternalistic,” the complaint says.

Experts have long charged the company with taking advantage of children, to their detriment. In May, a group of nearly 70 “top children’s rights advocates” wrote a letter to Zuckerberg in which they outlined the known mental health risks social media poses to children and cautioned him against opening other platforms, like virtual reality, to children.

They wrote:

“Your business model relies on maximizing user engagement and time spent on your platform, regardless of the risks that poses to users of any age. As a result, users — including children and teens — are served harmful, attention-grabbing content that promotes alcohol, drugs, anorexia and unhealthy diets, and dangerous challenges.”

A March 2023 report from the Center for Countering Digital Hate found that, in addition to engagement on Meta’s platforms leading to a range of mental health issues, minors on some of its platforms also experienced other concrete harms. For example, many young people are “routinely exposed to harassment and abuse — including sexually explicit insults and racist, misogynistic, and homophobic harassment — and other offensive content.”

At a Senate subcommittee hearing in November, former Meta employee Arturo Bejar shared data indicating that between 13% and 24.4% of children ages 13-15 had received unwanted sexual advances.

A broader legal and regulatory strategy

In 2021 Meta announced it was developing an Instagram for Kids, although the idea faced immediate backlash from child development experts and members of Congress.

Soon after that, the whistleblower reports were made public. The documentary “The Social Dilemma,” released in 2020, had already used insider testimony to sound the alarm about the effects of social media on kids’ and teens’ mental health.

Although Meta eventually scrapped plans for Instagram for Kids, Bonta announced a nationwide investigation into whether Meta, via Instagram, had deliberately designed a platform to addict children, knowing the harms.

This lawsuit is one outcome of that investigation. As part of this coordinated effort, attorneys general in nine other states are also filing lawsuits against Meta in their respective state courts, The Associated Press reported in October.

New York Times tech reporter Natasha Singer said the multi-state investigation and legal strategy resemble the playbook used against Big Tobacco.

Dunckley said it is important that action be taken against these companies. “We can’t just keep hoping that social media companies will police themselves,” she said. “They won’t, and if they do, it won’t be enough.”

Meta has also faced lawsuits for privacy violations in the past. In 2019 the Federal Trade Commission (FTC) ordered the tech giant to change some of its practices and pay a record $5 billion fine for deceiving users about their ability to control their personal data.

That decision came amid a 2019 push by the FTC to target Big Tech firms for a range of alleged anti-competitive practices. The agency has also successfully filed complaints against Google and YouTube, Amazon, Microsoft and Epic Games.