Court documents reveal Meta's top safety officials internally criticized the company's plan to encrypt Facebook Messenger, warning it would severely hamper efforts to detect child exploitation. The revelations come from a New Mexico lawsuit alleging Meta failed to protect minors from predators on its platforms.

Internal company communications show that Meta’s senior executives moved forward with encrypting Facebook and Instagram messaging despite strong objections from safety officials who warned the change would severely limit the company’s capacity to identify and report child exploitation to authorities.
Court documents filed in a New Mexico lawsuit reveal that Monika Bickert, who leads Meta’s content policy division, expressed sharp criticism in March 2019 internal messages as CEO Mark Zuckerberg prepared to announce the encryption initiative.
“We are about to do a bad thing as a company. This is so irresponsible,” Bickert stated in the company chat.
These previously unreported documents became public Friday as part of a lawsuit filed by New Mexico Attorney General Raul Torrez. The suit alleges that Meta gave predators unrestricted access to minors and facilitated connections that led to actual abuse and human trafficking. The groundbreaking case is now before a jury.
The revelations surface amid mounting legal challenges and regulatory pressure worldwide concerning the protection of young users across Meta’s platforms.
Beyond the New Mexico litigation focused on alleged failures to prevent child predation, over 40 state attorneys general are pursuing separate claims that Meta’s services negatively impact youth mental health.
Multiple school systems have also filed lawsuits against the company, while Zuckerberg provided testimony last week in another case in Los Angeles County Superior Court involving a teenager allegedly harmed by Meta’s products.
The New Mexico court filing specifically challenges Meta’s public statements about the safety measures surrounding its decision to implement automatic end-to-end encryption for Facebook Messenger, initially announced in 2019 and later extended to Instagram direct messaging.
ELEVATED CONCERNS
End-to-end encryption technology ensures that messages are scrambled during transmission and can only be read by the intended recipient’s device. This privacy feature is commonly found in messaging platforms like Apple’s iMessage, Google Messages, and Meta’s WhatsApp service.
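The core idea described above can be sketched in code. The toy Python example below is an illustration only, not real cryptography and not how Messenger, iMessage, or WhatsApp are actually implemented (those use protocols such as the Signal protocol with X25519 key agreement and authenticated ciphers). It mimics the shape of end-to-end encryption with classic Diffie-Hellman key agreement over a standard prime group and a hash-derived keystream; the fixed private keys and message are made up for the demo.

```python
# Toy sketch of end-to-end encryption (illustration only, NOT secure):
# each party holds a private key that never leaves their device, so a
# relaying server sees only ciphertext and public values.
import hashlib

# 2048-bit MODP prime from RFC 3526 (group 14) and its generator.
P = int(
    "FFFFFFFFFFFFFFFFC90FDAA22168C234C4C6628B80DC1CD129024E08"
    "8A67CC74020BBEA63B139B22514A08798E3404DDEF9519B3CD3A431B"
    "302B0A6DF25F14374FE1356D6D51C245E485B576625E7EC6F44C42E9"
    "A637ED6B0BFF5CB6F406B7ED EE386BFB5A899FA5AE9F24117C4B1FE6"
    "49286651ECE45B3DC2007CB8A163BF0598DA48361C55D39A69163FA8"
    "FD24CF5F83655D23DCA3AD961C62F356208552BB9ED529077096966D"
    "670C354E4ABC9804F1746C08CA18217C32905E462E36CE3BE39E772C"
    "180E86039B2783A2EC07A28FB5C55DF06F4C52C9DE2BCBF695581718"
    "3995497CEA956AE515D2261898FA051015728E5A8AACAA68FFFFFFFF"
    "FFFFFFFF".replace(" ", ""), 16)
G = 2

def keystream(shared: int, n: int) -> bytes:
    # Derive n pseudorandom bytes from the shared secret (demo only).
    out, counter = b"", 0
    while len(out) < n:
        out += hashlib.sha256(
            shared.to_bytes(256, "big") + counter.to_bytes(4, "big")
        ).digest()
        counter += 1
    return out[:n]

def encrypt(shared: int, data: bytes) -> bytes:
    # XOR the message with the keystream; XOR is its own inverse.
    return bytes(a ^ b for a, b in zip(data, keystream(shared, len(data))))

decrypt = encrypt

# Each side picks a private key; only the public values cross the wire.
alice_priv, bob_priv = 1234567, 7654321   # fixed for the demo; random in practice
alice_pub, bob_pub = pow(G, alice_priv, P), pow(G, bob_priv, P)

# Both sides derive the same shared secret; the server never learns it.
shared_alice = pow(bob_pub, alice_priv, P)
shared_bob = pow(alice_pub, bob_priv, P)
assert shared_alice == shared_bob

ciphertext = encrypt(shared_alice, b"meet at noon")
print(decrypt(shared_bob, ciphertext))  # b'meet at noon'
```

This is exactly the property at issue in the article: because decryption requires a secret held only on the endpoints, the platform operator cannot scan message contents in transit.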
However, child protection organizations, including the National Center for Missing and Exploited Children (NCMEC), have raised concerns that implementing this technology within public social networks that easily connect children with strangers creates additional dangers.
The court filings demonstrate that Meta’s own safety leadership shared these concerns. While Zuckerberg publicly assured that the company was addressing potential risks, internal communications show top safety and policy executives voiced serious reservations, with Bickert criticizing what she called “gross misstatements of our ability to conduct safety operations.”
“I’m not very invested in helping him sell this, I must say,” Bickert wrote regarding Zuckerberg’s public promotion of encryption for privacy reasons. She noted that with end-to-end encryption, “there is no way to find the terror attack planning or child exploitation” and proactively alert law enforcement.
Internal company analysis from February 2019 projected that if Messenger had been encrypted the previous year, Meta’s reports of child nudity and sexual exploitation imagery to NCMEC would have dropped from 18.4 million to 6.4 million cases, a 65% reduction.
A subsequent version of the same analysis indicated Meta would have been “unable to provide data proactively to law enforcement in 600 child exploitation cases, 1,454 sextortion cases, 152 terrorist cases [and] 9 threatened school shootings.”
ENHANCED PROTECTION MEASURES
Meta representative Andy Stone responded to inquiries by explaining that the concerns raised by Bickert and Antigone Davis, Meta’s Global Head of Safety, prompted the company to develop enhanced safety tools before rolling out encrypted messaging for Facebook and Instagram in 2023.
Although messages are now encrypted automatically, users retain the ability to report problematic content to Meta for evaluation and potential law enforcement referral.
“The concerns raised in 2019 represent the very reason we developed a range of new safety features to help detect and prevent abuse, all designed to work in encrypted chats,” Stone explained.
The company’s protective measures included establishing specialized accounts for minors that block unknown adults from initiating contact with underage users.
Safety executives particularly highlighted concerns about children being targeted on Meta’s public social media platforms and then victimized through private messaging features.
“FB [Facebook] allows pedophiles to find each other and kids via social graph with easy transition to Messenger,” Davis wrote in a 2019 email evaluating the plan’s dangers.
She contrasted this with Meta’s existing encrypted WhatsApp service, noting it operates independently from social media platforms and therefore presents fewer risks.
“WA (WhatsApp) does not make it easy to make social connections, meaning making Messenger e2ee (end-to-end encrypted) will be far, far worse than anything we have seen/gotten a glimpse of on WA,” Davis stated.