New Mexico jury finds Meta violated consumer protection law in landmark child safety trial

A New Mexico jury ordered Meta to pay $375M for violating state consumer protection law by concealing risks to children on Instagram, Facebook, and WhatsApp.

A New Mexico jury found on Tuesday, 24 March 2026, that Meta Platforms violated state consumer protection law by failing to disclose risks to children posed by its social media platforms. The verdict, reached after a nearly seven-week trial in Santa Fe, New Mexico, marks the first time a jury in the United States has ruled against a major social media company on such claims. The jury ordered Meta to pay 375 million United States dollars in civil penalties under New Mexico’s Unfair Practices Act.

The jury determined that Meta, the parent company of Instagram, Facebook, and WhatsApp, made false or misleading statements about platform safety and engaged in trade practices that were unconscionable because they exploited the vulnerabilities and inexperience of children. Jurors identified thousands of individual violations, each counted separately under New Mexico state law at the maximum statutory rate of 5,000 dollars per violation.

New Mexico Attorney General Raul Torrez initiated the lawsuit against Meta in 2023 following an undercover investigation conducted by the New Mexico Department of Justice. State agents created accounts on Facebook and Instagram posing as users younger than 14 to document sexual solicitations and to observe how Meta responded to those interactions. The accounts received sexually explicit material and were contacted by adults seeking similar content, generating evidence that led to criminal charges against multiple individuals, according to the office of the attorney general.

How New Mexico prosecutors argued Meta concealed child safety risks for more than a decade

The state’s case rested on the argument that Meta had accumulated detailed internal knowledge of the dangers its platforms posed to children but had concealed that knowledge from regulators, users, and the public while publicly asserting that its platforms were safe for young people. Internal company documents and witness testimony introduced at trial showed that Meta employees and outside child safety experts had repeatedly warned company leadership about those dangers without the company taking sufficient remedial action.

Prosecutors identified specific platform design features as mechanisms of harm to minors. Infinite scroll, auto-play video, and algorithmic content recommendation systems were cited as features that extended the duration of young users’ engagement on Meta platforms and fostered addictive behavior linked to depression, anxiety, and self-harm. The state alleged Meta continued to deploy those features despite internal evidence of harm because they increased user engagement and, correspondingly, advertising revenue.

In closing arguments on 23 March 2026, Linda Singer, an attorney for the state, told the jury that over the course of a decade Meta had failed to act honestly and transparently and had failed to protect young people in New Mexico. Singer told the jury it was within its authority to award more than 2 billion dollars in damages. State Chief Deputy Attorney General James Grayson characterized the case as being about one of the largest technology companies in the world taking advantage of New Mexico teenagers.


What internal Meta documents revealed about child sexual abuse material and Facebook Messenger encryption

Among the most significant evidence introduced at trial were internal communications from Meta employees that discussed the consequences of Chief Executive Officer Mark Zuckerberg's 2019 announcement that Facebook Messenger would become end-to-end encrypted by default. Those communications, introduced through legal filings by New Mexico prosecutors, discussed the impact of end-to-end encryption on Meta's ability to detect and report child sexual abuse material to law enforcement. The internal materials indicated that approximately 7.5 million child sexual abuse material reports could be affected.

End-to-end encryption prevents third parties, including platform operators and law enforcement agencies, from reading the content of private messages. The New Mexico Department of Justice argued that Meta’s decision to implement end-to-end encryption across Facebook Messenger without adequate countermeasures for detecting and reporting child sexual abuse material represented a deliberate choice that subordinated child safety to product strategy.

What witnesses and evidence were presented during the seven-week New Mexico trial against Meta

The New Mexico trial examined a substantial volume of Meta’s internal correspondence and reports relating to child safety across the company’s platforms. Jurors heard testimony from Meta executives, platform engineers, former employees who described themselves as whistleblowers, psychiatric experts, and technology safety consultants. Public school educators in New Mexico also testified, describing disruptions in their schools linked to social media use, including sextortion schemes that targeted children on Meta platforms.

In reaching its verdict, the jury considered whether social media users were misled by specific public statements made by Meta Chief Executive Officer Mark Zuckerberg, Instagram head Adam Mosseri, and Meta global head of safety Antigone Davis. During deliberations, jurors worked from a checklist of allegations prepared by prosecutors. That checklist covered claims that Meta failed to disclose what it knew about problems with enforcing its ban on users under the age of 13, the prevalence of content on its platforms relating to teen suicide, and the role of Meta’s recommendation algorithms in prioritizing sensational or harmful content.

How Meta defended itself against New Mexico’s consumer protection claims in closing arguments

Meta mounted a defense based on arguments of transparency, federal legal protection, and the company's affirmative safety investments. Meta attorney Kevin Huff told the jury that the evidence showed the company's robust disclosures and efforts to prevent harmful content, and that those disclosures established Meta had not knowingly and intentionally misled the public. Huff stated that Meta designed its applications to help people connect with friends and family, not to connect predators, and that the company invests in safety both because it is the right thing to do and because it is good for business.


Meta also invoked federal legal protections, arguing it is shielded from liability under Section 230 of the Communications Decency Act, a federal statute that generally bars civil lawsuits against websites over user-generated content, and under the free-speech provisions of the First Amendment to the United States Constitution. The company argued that claims of harm arising from content on its platforms cannot be separated from that content, which is generated by users rather than by Meta.

Following the verdict, Meta spokesperson Andy Stone said in a statement that the company respectfully disagreed with the verdict and would appeal. Stone said Meta works hard to keep people safe on its platforms and is clear about the challenges of identifying and removing bad actors or harmful content, and that the company would continue to defend itself vigorously and remained confident in its record of protecting teenagers online.

What the New Mexico verdict means for the broader wave of child safety litigation against social media companies

The New Mexico verdict makes the state the first in the nation to secure a jury ruling against a major technology company on claims related to the harm its platforms cause to young people. The decision carries significance beyond New Mexico because it demonstrates that state consumer protection statutes can be applied against social media companies in the context of child safety and that federal protections under Section 230 of the Communications Decency Act do not automatically immunize such companies from state law claims.

The New Mexico case is one of several social media trials proceeding in the United States in 2026. At the time of the New Mexico verdict, a separate jury in a federal court in Southern California was deliberating in one of three bellwether cases, described by observers as potentially setting the course for thousands of similar personal injury lawsuits against Meta and Google’s YouTube. A third federal trial in the Northern District of California was scheduled to commence later in 2026.

More than 40 state attorneys general in the United States have filed lawsuits against Meta, arguing that the company deliberately designed Instagram and Facebook with features that are addictive to young people and that its conduct has contributed to a nationwide mental health crisis among minors. The wave of litigation has been compared by legal observers to the state-level tobacco litigation of the 1990s, in which state governments alleged tobacco companies misled the public about the health effects of their products.


The second phase of the New Mexico trial, a bench trial before Judge Bryan Biedscheid without a jury, is scheduled for 4 May 2026. In that phase, the court will determine whether Meta created a public nuisance and whether the company should be required to fund public programs to address the alleged harms. The state’s attorneys have also urged Meta to implement effective age verification, remove predators from its platforms, and protect minors from encrypted communications that shield bad actors.

Attorney General Raul Torrez described the verdict as a historic victory for every child and family who had paid the price for Meta's choice to put profits over children's safety. Torrez said Meta executives knew their products harmed children, disregarded warnings from their own employees, and were not truthful with the public about what they knew.

Key takeaways on what the New Mexico verdict means for Meta, children’s safety, and social media regulation

  • A New Mexico jury on 24 March 2026 found Meta Platforms liable for violating New Mexico’s Unfair Practices Act by concealing known risks of child sexual exploitation and mental health harm on Instagram, Facebook, and WhatsApp, ordering the company to pay 375 million United States dollars in civil penalties.
  • The verdict marks the first time a jury in the United States has ruled against a major social media company on consumer protection claims tied to child safety, setting a legal precedent that state laws can impose liability independent of federal Section 230 protections.
  • Meta has announced it will appeal the verdict. A second phase of the New Mexico trial, a bench proceeding before Judge Bryan Biedscheid, is scheduled for 4 May 2026 and will determine whether Meta created a public nuisance and owes further remedies under New Mexico law.
  • The New Mexico case is one of multiple social media trials proceeding in United States courts in 2026, including a bellwether federal trial in Southern California and a separate federal trial in the Northern District of California, collectively addressing thousands of lawsuits alleging that Meta and other social media companies designed addictive platforms that harmed children.
  • More than 40 state attorneys general in the United States have active lawsuits against Meta, reflecting a coordinated legal strategy by state governments that observers have compared to the multistate tobacco litigation of the 1990s in its scope and potential implications for the social media industry.
