Employees at Facebook parent Meta Platforms and TikTok developer ByteDance were aware of the harmful effects of their platforms on young children and teenagers but disregarded the information or in some cases sought to undermine it, court filings show.
The revelations were disclosed in a previously filed lawsuit over social media addiction, key portions of which had been sealed from public view. An unredacted version filed over the weekend in federal court in Oakland, California, offers details about how much engineers and others, including Meta CEO Mark Zuckerberg, knew about the harms of social media, and about the misgivings they harbored.
“No one wakes up thinking they want to maximize the number of times they open Instagram that day,” one Meta employee wrote in 2021, according to the filing. “But that’s exactly what our product teams are trying to do.”
The case in Oakland consolidates scores of complaints filed across the United States on behalf of adolescents and young adults who allege that Facebook, Instagram, TikTok, Snapchat and Google’s YouTube caused them to suffer anxiety, depression, eating disorders and sleeplessness. More than a dozen suicides have also been blamed on the companies, based on claims that they knowingly designed algorithms that drew children down dangerous and addictive paths. Several public school districts have filed suits as well, alleging they can’t fulfill their educational mission while students are coping with mental-health crises.
In their defense, the social media giants point to Section 230 of the Communications Decency Act, a 1996 law that gives internet platforms broad immunity from claims over harmful content posted by users. Both sides are closely watching a Supreme Court case that will likely determine the fate of the litigation in Oakland.
According to the new filing, internal documents at TikTok parent ByteDance show that the company knows young people are more susceptible to being lured into trying dangerous stunts they view on the platform—known as viral challenges—because their ability to weigh risk isn’t fully formed.
Young people are more likely to “overestimate their ability to cope with risk,” and their “ability to understand the finality of death is also not fully fledged,” according to the filing.
Another unsealed portion of the filing contends that instead of moving to address the problems around children using Instagram and Facebook, Meta defunded its mental health team.
The filing says Zuckerberg was personally warned: “We are not on track to succeed for our core well-being topics (problematic use, bullying & harassment, connections, and SSI), and are at increased regulatory risk and external criticism. These affect everyone, especially Youth and Creators; if not addressed, these will follow us into the Metaverse.”
Snap had no immediate comment on the court filing. Representatives of Meta and TikTok didn’t immediately respond to requests for comment.
The companies have previously said that user safety is a priority and that they have taken affirmative steps to give parents more control over their children’s use of the platforms and to provide more mental health resources.