A jury in Santa Fe, New Mexico, has delivered a landmark decision: it found Meta liable for violating state consumer protection law by misleading users about safety on its platforms — including Facebook, Instagram, and WhatsApp — and failing to adequately protect children from sexual exploitation and other harms.
Civil Penalty and Basis of Ruling
The jury ordered Meta to pay $375 million in civil penalties, the maximum allowed under New Mexico’s Unfair Practices Act, based on thousands of individual violations.
Prosecutors argued that Meta’s platforms became environments where child sexual exploitation, explicit content, solicitation, and potential trafficking could flourish — and that the company misrepresented the safety of those platforms to the public.
The verdict is historic: it marks the first time a jury has held Meta accountable at trial for these kinds of harm, a milestone in U.S. litigation over tech companies' responsibilities to younger users.
Evidence and Claims Presented in Court
State’s Arguments
The New Mexico Attorney General’s office, led by Raúl Torrez, built its case around several core claims.
Meta allegedly prioritized engagement and profits over user safety, especially regarding minors.
Internal documents and testimony introduced at trial indicated that leadership failed to respond adequately even after employees and outside experts warned about risks such as grooming, exploitation, and harmful content.
In a 2023 undercover operation, investigators created fake child profiles that quickly drew solicitations from adults; prosecutors presented this as key evidence of how predators could exploit the platforms.
Meta’s Response
Meta strongly disputed the claims both at trial and after the verdict. The company said it works hard to protect users and that identifying and removing harmful content remains a challenge for every large social platform. Meta also indicated it will appeal the verdict.
Meta’s legal team has framed many of the allegations as exaggerated or misleading, and noted ongoing investments in features like tools for teens, parental controls, and automated systems to detect harmful content.
Broader Context and Implications
Beyond New Mexico
The New Mexico case didn’t arise in isolation. It’s part of a wider wave of legal challenges facing Meta and other tech companies. Across the U.S., states, families, and school districts have filed lawsuits alleging that social media platforms harm youth — not just through exploitation, but also by contributing to mental health issues such as anxiety, depression, eating disorders, and harmful behavior patterns.
Next Legal Steps
A second phase of the New Mexico trial, a bench trial decided by the judge alone, is scheduled for May. State prosecutors will argue that Meta created a public nuisance and should face additional penalties, potentially including structural reforms to how the platforms operate, such as improved age verification and stronger moderation requirements.
Potential Ripple Effects
Legal experts and child safety advocates are watching this closely.
Some believe the verdict will encourage other states to pursue similar cases. Others see it as part of a broader push for greater accountability and regulatory oversight of major tech platforms with millions of young users.
Human and Public Safety Concerns Highlighted
Beyond the legal and financial stakes, the verdict spotlights serious public safety questions about how social media affects children.
Prosecutors and advocates argued that design features like autoplay and algorithmic recommendations can expose young users to harmful content far too easily.
Law enforcement testimony during the trial highlighted challenges in investigating online exploitation, particularly where encrypted messaging and large volumes of automated abuse reports can inhibit response efforts.
This case has brought those issues to the forefront of public debate, prompting renewed calls for reforms, better protections for young people, and clearer accountability for powerful tech platforms.