
Meta stopped internal research on social media’s impact on people after finding negative results, according to a court filing released Friday.
The filing was made in a Northern California district court as part of a suit brought by a group of U.S. state attorneys general, school districts, and parents against Meta, Google-owned YouTube, TikTok, and Snap.
The court documents allege that Meta misled the public about the mental health risks to children and young adults who excessively use Facebook and Instagram, even though the company’s own research had demonstrated harm from the apps.
“The company never publicly disclosed the results of its deactivation study,” the lawsuit says. “Instead, Meta lied to Congress about what it knew.”
The research, code-named “Project Mercury,” took place in 2020. Meta scientists worked with survey firm Nielsen to measure the impact of deactivating Facebook. According to internal documents, “people who stopped using Facebook for a week reported lower feelings of depression, anxiety, loneliness, and social comparison.” According to the filings, instead of pursuing more research, Meta dropped the project, claiming that participants’ feedback was “the result of the existing media narrative around the company.”
Politico reported that in a sealed deposition earlier this year, Meta’s employees expressed concern about the research’s findings. “Oh my gosh, y’all. IG is a drug,” Shayli Jimenez, a Meta senior researcher, is quoted as saying in internal documents. In response, another employee allegedly said, “We’re basically pushers.” The Politico story reported that Jimenez said in her deposition that the comments were made “sarcastically.”
In a statement, Meta spokesperson Andy Stone said: “We strongly disagree” with the allegations, “which rely on cherry-picked quotes and misinformed opinions in an attempt to present a deliberately misleading picture.” Stone continued: “The full record will show that for over a decade, we have listened to parents, researched issues that matter most, and made real changes to protect teens—like introducing Teen Accounts with built-in protections and providing parents with controls to manage their teens’ experiences.”
In a series of posts on Bluesky, Stone also pushed back against the idea that Meta was trying to bury the results of the terminated Nielsen study, noting that it found that people who believed using Facebook was bad for them felt better when they stopped using it. “It makes intuitive sense, but it doesn’t show anything about the actual effect of using the platform,” Stone wrote.
However, the latest uproar over Meta’s research is hardly the first time the company’s impact on children’s mental health has been questioned—even by its own employees. In 2021, former Facebook product manager Frances Haugen leaked to the government hundreds of internal company documents that referenced risks to children. Haugen said the company’s leadership knows how to make Facebook and Instagram safer but refuses to “because they have put their astronomical profits before people.”
A growing body of evidence, beyond the company’s own research, has long pointed to the harm that social media may have on children’s mental health. A 2019 study found that teens who spent more than three hours a day on social media “may be at heightened risk for mental health problems, particularly internalizing problems.” Research also shows that rates of mental health disorders among today’s youth are at an all-time high and still rising.
In response to growing concern about children’s mental health, then-U.S. Surgeon General Vivek Murthy, in a 2023 report, called on social media companies and policymakers to act rather than leave the entire burden of limiting kids’ time on social media to parents.