Social media platform faces legal action after UK children's deaths
Four British families have initiated legal proceedings against TikTok in Delaware's Superior Court, linking their children's deaths to a dangerous viral trend known as the "blackout challenge." The lawsuit, filed Thursday, concerns the deaths of four minors: Isaac Kenevan, 13, Archie Battersbee, 12, Julian "Jools" Sweeney, 14, and Maia Walsh, all of whom reportedly died in 2022 after attempting the hazardous online challenge.
The Social Media Victims Law Center, representing the families, has targeted ByteDance, TikTok's parent company, alleging its platform deliberately creates addictive patterns that expose children to harmful content. The lawsuit contends that users receive dangerous content recommendations regardless of their search preferences.
Lisa Kenevan, mother of Isaac, expressed frustration with TikTok's response, describing it as lacking empathy and relying on standardized corporate messaging. The families aim to establish accountability and prevent future tragedies through their legal action.
In response, TikTok maintains that it actively combats harmful content, citing measures such as blocking searches related to the challenge and removing 99% of rule-violating content before it is reported by users. The platform also addressed its data retention policies, explaining that personal information is deleted unless law enforcement specifically requests that it be retained.
The case highlights growing concerns about social media's influence on youth safety, particularly regarding algorithm-driven content distribution that potentially prioritizes user engagement over protective measures.