Social media platform faces legal action after UK children's deaths
Four British families have initiated legal proceedings against TikTok in Delaware's Superior Court, linking their children's deaths to a dangerous viral trend known as the "blackout challenge." The lawsuit, filed Thursday, concerns four minors: Isaac Kenevan, 13, Archie Battersbee, 12, Julian "Jools" Sweeney, 14, and Maia Walsh, all of whom reportedly died in 2022 after participating in the hazardous online challenge.
The Social Media Victims Law Center, representing the families, has targeted ByteDance, TikTok's parent company, alleging its platform deliberately creates addictive patterns that expose children to harmful content. The lawsuit contends that users receive dangerous content recommendations regardless of their search preferences.
Lisa Kenevan, mother of Isaac, expressed frustration with TikTok's response, describing it as lacking empathy and relying on standardized corporate messaging. The families aim to establish accountability and prevent future tragedies through their legal action.
In response, TikTok maintains that it actively combats harmful content, citing measures such as blocking searches related to the challenge and removing 99% of rule-violating content before users report it. The platform also addressed its data retention policies, explaining that personal information is deleted unless law enforcement specifically requests that it be retained.
The case highlights growing concerns about social media's influence on youth safety, particularly regarding algorithm-driven content distribution that potentially prioritizes user engagement over protective measures.