Seven French families sue TikTok, alleging platform's harmful content led to teen suicides

Seven French families have filed a lawsuit against TikTok, alleging that the social media platform’s algorithm exposed their teenage children to harmful content, leading two of them to take their own lives at the age of 15.

The case, filed in the Créteil judicial court, claims that TikTok’s algorithm promoted videos encouraging suicide, self-harm, and eating disorders. According to Laure Boutron-Marmion, the families’ lawyer, this is the first collective lawsuit of its kind in Europe targeting TikTok’s accountability.

“The parents want TikTok’s legal liability to be recognised in court,” Boutron-Marmion told broadcaster franceinfo, arguing that the company, by offering its product to a young audience, should be held accountable for any harmful impacts. The lawsuit contends that TikTok’s content moderation failed to safeguard underage users from dangerous material.

TikTok, along with other social media platforms, has faced significant scrutiny over the regulation of content and its effects on young users. In the United States, both TikTok and Meta’s platforms—Facebook and Instagram—face hundreds of lawsuits claiming that their algorithms have caused mental health problems by addicting and negatively influencing millions of children.

TikTok did not immediately respond to requests for comment on the lawsuit. However, the company has previously stated that it prioritises issues related to children’s mental health. Earlier this year, CEO Shou Zi Chew assured U.S. lawmakers that TikTok has invested in initiatives to protect young users.

Melissa Enoch
