Mark Zuckerberg has offered apologies to lawmakers and grieving parents, but critics argue his actions at Meta tell a different story. Two years after a Senate hearing where he turned to face families who lost children to platform-related harms, the company's internal documents and whistleblower accounts paint a picture of a leadership team that consistently puts profit ahead of protecting minors.
Haley McNamara, executive director of the National Center on Sexual Exploitation, writes that under Zuckerberg's direction, Meta's platforms—Facebook, Instagram, Messenger, and WhatsApp—have become breeding grounds for child sexual abuse, grooming, sextortion, and trafficking. A top Meta child-safety researcher internally warned executives of roughly 500,000 daily cases of minors receiving sexually exploitative messages on Facebook and Instagram, even when counting only English-language markets. “We expect the true situation is worse,” the researcher noted.
Instagram's algorithms recommended 1.4 million potentially dangerous adults to teens in a single day. Meta's policy allowed up to 17 strikes before suspending accounts flagged for sex trafficking. Its AI chatbot was built with guidelines permitting “romantic or sensual” conversations with minors, and a child-safety group found only 17 percent of Instagram's teen safety features actually worked. The scale of the crisis, McNamara argues, amounts to a national emergency that fills stadiums with exploited children every day.
McNamara, who served as an expert witness in New Mexico's recent lawsuit against Meta, describes years of warnings to the company about dangerous design features. “Meta met our efforts with delays, excuses, and half-measures,” she writes. Internal communications exposed by Reuters show one employee stating that “child safety is an explicit non-goal,” while another employee, frustrated over Zuckerberg's rejection of parental controls for AI chatbots, reportedly asked, “Is he f***ing nuts?”
Whistleblower Cayce Savage previously stated that “Meta has spent the time and money it could have spent making its products safer [on] shielding itself instead.” Indeed, Zuckerberg has poured $65 million into lobbying efforts even as the company faces mounting scrutiny. Critics say the problem is not technological but a matter of choice: Zuckerberg, as founder, CEO, chairman, and controlling shareholder with majority voting power, could radically reduce online child sexual exploitation by ensuring strangers cannot find and contact children on his platforms.
Such changes would likely reduce engagement and user numbers, but McNamara argues that for a man already among the world's richest and most powerful, the trade-off is worthwhile. “Until Zuckerberg changes his priorities and chooses a better legacy, children will be abused on his platforms,” she writes. The responsibility, she concludes, stops at his desk.
