US court filings claim Meta hid evidence of Facebook’s mental health harms

Meta was reportedly aware of the negative impacts of its platform but acted in a manner that minimized risks to young users.

California: In a major discovery, unredacted files in a lawsuit against top social media platforms revealed that Meta halted internal research which allegedly found that users reported lower rates of anxiety and depression after deactivating their Facebook and Instagram accounts.

In 2020, Meta scientists conducted an internal study, code-named “Project Mercury,” to understand the effects of deactivating Facebook, according to Meta documents. The finding was that “people who stopped using Facebook for a week reported lower feelings of depression, anxiety, loneliness, and social comparison.”

However, these results were neither made public nor pursued further. Meta shut the project down and internally declared that the study’s findings were tainted by the “existing media narrative” around the company, which was negative.

According to the filing, some Meta employees were uncomfortable with the decision. One employee compared it to the tobacco industry “doing research and knowing cigarettes were bad and then keeping that info to themselves.”

Moreover, Time‘s report on seven allegations against Meta in the newly unsealed filings revealed that sex trafficking on Meta platforms was “widely tolerated” and hard to flag.

Instagram‘s former head of safety and well-being, Vaishnavi Jayakumar, testified in the lawsuit that upon joining the company, she was surprised to learn that Meta had a 17-strike policy for accounts engaged in “trafficking of humans for sex.”

“You could incur 16 violations for prostitution and sexual solicitation, and upon the 17th violation, your account would be suspended,” Jayakumar reportedly testified.

The brief was filed in the Northern District of California by Motley Rice, a law firm suing Meta, Google, TikTok, and Snapchat on behalf of school districts nationwide.

One of the main reasons the company halted the research and “downplayed” the effects of these harmful features, the brief alleges, was a growing fear of losing young teenage users.

The brief also accuses the company of failing to disclose these implications to Congress and of refusing to implement safety measures that could have protected its younger users.

In December 2020, Congress asked Facebook whether it had any ability to determine if heavier use of its platform correlated with increased signs of depression and anxiety among teenage girls.

Meta said “No.”

Meta’s response

In response, Meta spokesperson Andy Stone stated, “The study found that people who believed using Facebook was bad for them felt better when they stopped.”

Stone wrote on X that this was an “expectation effect,” a phenomenon in which a person’s prior beliefs shape the outcome they report.

He added that the study was not continued because the company’s staged approach to overcoming expectation effects “unfortunately” did not work.

Most of the allegations against the other social media platforms were dismissed under Section 230 of the US Communications Decency Act, which gives online service providers and users limited immunity from liability for content created by others.

While Meta has introduced measures to limit younger users’ exposure to inappropriate content, such as teen accounts, the filing alleges that:

  • For years, it stalled internal efforts to prevent predators from contacting minors.
  • It pressured safety staff to spread information justifying the decision.
  • Meta intentionally designed its youth safety features to be mostly ineffective and rarely used. It also halted the testing of safety features that were deemed detrimental to its growth.
  • Another significant allegation was that Mark Zuckerberg, in a 2021 text message, said that child safety was not his top concern, “when I have a number of other areas I’m more focused on like building the metaverse.”

Stone denied all the allegations, saying that under the company’s current policy, accounts reported for sex trafficking are immediately removed.

“We strongly disagree with these allegations, which rely on cherry-picked quotes and misinformed opinions,” the company’s statement read.

The Meta documents cited in the filing are not public, and Meta has filed a motion to strike them. The Northern District of California court has scheduled a hearing on the filing for January 26, Reuters reported.

News Desk

NewsDesk is our dedicated team of multimedia journalists at Siasat.com, delivering round-the-clock coverage of breaking news and events worldwide.