Meta Hit with $375 Million Penalty in Landmark New Mexico Child Safety Case
A New Mexico jury delivered a significant blow to Meta Platforms on Tuesday, ruling that the tech giant violated state consumer protection laws in a high-profile lawsuit. The case, brought by New Mexico Attorney General Raúl Torrez, accused Meta of misleading users about the safety of its platforms—Facebook, Instagram, and WhatsApp—and enabling child sexual exploitation. The jury ordered Meta to pay $375 million in civil penalties, marking the first jury verdict on such claims against the company.
Verdict and Meta's Response
The six-week trial concluded with the jury finding that Meta's actions breached New Mexico's consumer protection statutes. In response, a Meta spokesperson stated, "We respectfully disagree with the verdict and will appeal." The company emphasized its efforts to maintain safety, noting, "We work hard to keep people safe on our platforms and are clear about the challenges of identifying and removing bad actors or harmful content." Representatives for the New Mexico attorney general did not immediately comment on the outcome.
Background and Allegations
The lawsuit stemmed from an undercover operation conducted by Attorney General Torrez's office in 2023. Investigators created accounts on Facebook and Instagram posing as users under 14, which received sexually explicit material and were contacted by adults seeking similar content. This led to criminal charges against multiple individuals. The state alleged that Meta falsely claimed its platforms were safe for children while concealing internal documents acknowledging issues with sexual exploitation and mental health harm.
Key accusations included:
- Allowing predators unfettered access to underage users, potentially leading to real-world abuse and human trafficking.
- Failing to implement basic safety tools, such as age verification, despite known risks.
- Designing features like infinite scroll and auto-play videos to maximize engagement, which the lawsuit claims fosters addictive behavior linked to depression, anxiety, and self-harm in young people.
Broader Legal and Regulatory Challenges
This verdict comes as Meta faces increasing scrutiny over its handling of child and teen safety. Whistleblower testimony before Congress in 2021 alleged that Meta knew its products could be harmful but refused to act. Additionally, the company is embroiled in thousands of lawsuits nationwide, accusing it and other social media firms of intentionally designing addictive products that contribute to a youth mental health crisis. Some of these suits seek damages in the tens of billions of dollars.
Meta has defended itself by arguing that it is shielded from liability under the First Amendment and Section 230 of the Communications Decency Act, which generally protects websites from lawsuits over user-generated content. The company contends that allegations of harm cannot be separated from platform content, as its algorithms and design features serve to publish that content.
Trial Details and Future Proceedings
During closing arguments, state attorney Linda Singer told the jury that Meta had "failed over and over again to act honestly and transparently" and said jurors could award over $2 billion in damages. Meta's attorney, Kevin Huff, countered that evidence showed "Meta's robust disclosures and tireless efforts to prevent harmful content," asserting the company did not knowingly lie to the public.
Looking ahead, Judge Bryan Biedscheid is scheduled to hold a bench trial in May on the state's claims that Meta created a public nuisance harming residents' health and safety. The state will seek court-ordered changes to Meta's platforms to align with state law, potentially mandating enhanced safety measures for children.
This case underscores the growing legal and public pressure on social media companies to address safety concerns, particularly regarding young users, as regulators and courts grapple with the complex interplay of technology, free speech, and consumer protection.