TikTok Algorithm Pushing Suicide-Related Videos On Kids
The latest suicide linked to TikTok exposes major issues with the app’s algorithm.
Social media works like this: if you interact with something, the platform shows you more of it. That’s true to varying degrees of most social networks, including Facebook, Instagram, and even Twitter. But TikTok’s algorithm is turned up to 11, and Bloomberg reports that it continues to push suicide-related posts to vulnerable teens, a pattern that came to light after Chase Nasca’s parents checked his TikTok account shortly after their son took his own life.
Chase Nasca’s mother, Michelle, was desperate to understand why her 16-year-old son had killed himself in 2022. She began opening the social media apps on his phone, looking for some hint of a motive. She never found one, but she was horrified when she opened TikTok and was greeted by dozens of posts, suggested by the app’s algorithm, about being depressed and wanting to die.
Nasca’s death eventually came up in a congressional hearing in which lawmakers grilled TikTok CEO Shou Zi Chew about the app’s tendency to push content glorifying suicide, self-harm, and eating disorders to the users most vulnerable to it. Chew’s responses during the hearing were widely seen as evasive, and both Democratic and Republican lawmakers hammered him over the app’s content, along with privacy concerns about China’s ability to view users’ data.
TikTok’s spokesperson has said that the company strives “to provide a positive and enriching experience” to its users — the very users it’s pushing pro-suicide content to.
Nasca’s parents have since filed a wrongful death lawsuit against TikTok, alleging that the unsolicited suicide videos influenced their son’s decision to take his own life. They argue that their son, like other teenagers, was especially susceptible to suggestion from this type of content.
This isn’t the first lawsuit filed against TikTok over a user’s self-inflicted death. A 10-year-old girl reportedly strangled herself while taking part in the so-called “Blackout Challenge,” which encourages people to choke themselves until they pass out. A judge dismissed that lawsuit in December 2022, however, ruling that Section 230 of the Communications Decency Act shields social media companies from liability for content created by their users.
It’s unclear whether Nasca’s parents will succeed; their lawsuit doesn’t target TikTok for hosting suicide-related content but for running an algorithm that serves that content to vulnerable users.
For its part, TikTok has a Trust and Safety team tasked with keeping harmful content, like the pro-suicide posts found on Nasca’s account, from reaching users. However, former members of the team say the company makes it almost impossible to do the job properly: while they can review and delete posted content, they have no insight into how TikTok’s algorithm works and no control over what users see.
TikTok has said it has made changes to prevent waves of content like this from taking over people’s feeds, but whatever changes it has made appear ineffective. Nasca’s parents can open their late son’s TikTok account and still see a feed flooded with pro-suicide content.