Court finds Meta enabled child exploitation and orders them to pay $375 million
A New Mexico jury has ordered Meta to pay $375 million in civil penalties after finding the company misled users about safety and enabled harm, including child sexual exploitation, on its platforms. The ruling marks a landmark moment in holding Big Tech accountable for prioritizing profit over the safety and wellbeing of children.

The case, brought by the New Mexico attorney general in December 2023, is the first jury trial to find Meta liable for harms facilitated through its platforms. At its core, the lawsuit exposed how design choices and corporate decisions created environments where children could be targeted, groomed, and exploited.

Profit over protection

The jury imposed the maximum penalty of $5,000 per violation under the state’s Unfair Practices Act. Jurors agreed that Meta knowingly misrepresented the safety of its platforms while failing to act on mounting evidence of harm.

New Mexico attorney general Raúl Torrez framed the verdict as a turning point in the fight to protect children online. As reported by The Guardian, Torrez stated:

The jury’s verdict is a historic victory for every child and family who has paid the price for Meta’s choice to put profits over kids’ safety.

Meta executives knew their products harmed children, disregarded warnings from their own employees, and lied to the public about what they knew. Today the jury joined families, educators, and child safety experts in saying enough is enough.

Internal documents and testimony revealed that company leaders understood the risks. Employees and external child safety experts repeatedly warned that Facebook and Instagram were facilitating child sexual exploitation. Despite this, executives failed to implement adequate safeguards.

In depositions shown during the trial, Meta executives, including CEO Mark Zuckerberg and Instagram head Adam Mosseri, acknowledged that harms such as sexual exploitation were inevitable given the scale of their platforms. Critics argue that this acceptance reflects a business model that tolerates harm as a cost of growth.

Platforms that enable abuse

The trial detailed how predators used Meta’s platforms to groom minors and share child sexual abuse material. Evidence included a 2024 undercover operation, “Operation MetaPhile”, which resulted in the arrest of three men after they attempted to meet children they had contacted online.

Investigators and experts from the National Center for Missing and Exploited Children testified that Meta’s reporting of crimes taking place on its platforms, including the exchange of child sexual abuse material, was deficient. According to investigators,

Meta has generated high volumes of “junk” reports by overly relying on AI to moderate its platforms…. These reports were useless to law enforcement and meant crimes could not be investigated.

At the same time, Meta’s 2023 decision to encrypt Facebook Messenger limited access to critical evidence. While encryption can enhance privacy, the court heard that it also shielded predators and obstructed investigations into child exploitation.

These failures highlight a broader issue: platforms designed to maximize engagement and growth can also create fertile ground for exploitation when safety is treated as secondary.
