Bloomberg reports that a newly filed lawsuit alleges that Chinese-owned TikTok’s algorithm purposefully steers more violent content to black teens than to white teens. The claim comes in a suit over the death of Englyn Roberts, a 14-year-old black girl; the complaint also names Facebook, Snapchat, and TikTok’s parent company ByteDance as defendants.
The lawsuit is the latest in a long line of suits blaming social media companies for getting teens addicted to their platforms and products. Roberts died in September 2020, roughly two weeks after attempting to take her own life; her parents allege that TikTok is aware of racial and socioeconomic biases in its algorithm.
Filed on Wednesday in San Francisco federal court, the complaint claims that, but for TikTok’s biased algorithm, Roberts would not have seen and become addicted to the harmful content that contributed to her death.
“TikTok’s social media product did direct and promote harmful and violent content in greater numbers to Englyn Roberts than what they promoted and amplified to other, Caucasian users of similar age, gender, and state of residence,” the lawsuit alleges.