A lawsuit against Meta (formerly Facebook) alleges that the company violated the Children’s Online Privacy Protection Act (COPPA) and state consumer protection statutes by collecting the personal information of children under 13 without parental consent.
Driving the news: The lawsuit claims that Meta knowingly refused to shut down Instagram accounts belonging to children under 13, despite receiving over a million reports of underage users from parents, friends, and online community members between 2019 and 2023.
- Attorneys general from 33 states are seeking court orders barring Meta from the alleged practices, along with civil penalties that could reach hundreds of millions of dollars, with fines assessed per violation.
- The lawsuit also alleges that Meta knew its algorithm could steer children toward harmful content, contributing to a national youth mental health crisis, and accuses the company of deliberately building addictive, self-esteem-lowering features into its platforms.
What they’re saying: Meta denies the allegations, saying it offers more than 30 tools to support safe, age-appropriate experiences online. The company says it backs federal legislation that would require app stores to obtain parental approval whenever teens under 16 download apps, so that age can be verified at the app-store level without ID checks on individual platforms.
- “Meta has profited from children’s pain by intentionally designing its platforms with manipulative features that make children addicted to their platforms while lowering their self-esteem,” New York attorney general Letitia James said in a statement last month.