Social media platform faces legal action after UK children's deaths
Four British families have initiated legal proceedings against TikTok in Delaware's Superior Court, linking their children's deaths to a dangerous viral trend known as the "blackout challenge." The lawsuit, filed Thursday, concerns four minors: Isaac Kenevan, 13, Archie Battersbee, 12, Julian "Jools" Sweeney, 14, and Maia Walsh, all of whom reportedly died in 2022 after participating in the hazardous online challenge.
The Social Media Victims Law Center, representing the families, has targeted ByteDance, TikTok's parent company, alleging its platform deliberately creates addictive patterns that expose children to harmful content. The lawsuit contends that users receive dangerous content recommendations regardless of their search preferences.
Lisa Kenevan, mother of Isaac, expressed frustration with TikTok's response, describing it as lacking empathy and relying on standardized corporate messaging. The families aim to establish accountability and prevent future tragedies through their legal action.
In response, TikTok maintains that it actively combats harmful content, citing measures such as blocking searches related to the challenge and removing 99% of rule-violating content before it is reported by users. The platform also addressed its data retention policies, explaining that personal information is deleted unless law enforcement specifically requests that it be preserved.
The case highlights growing concerns about social media's influence on youth safety, particularly regarding algorithm-driven content distribution that potentially prioritizes user engagement over protective measures.