
Grok issues apology after serious content moderation failure

Saturday 03 January 2026 - 15:00
By: Sahili Aya

Grok, the artificial intelligence tool developed by Elon Musk’s company xAI, has issued a public apology following a major controversy linked to the circulation of illegal content generated by its image system. The incident sparked widespread criticism and renewed concerns over the safeguards used in generative AI technologies.

In a statement released on its platform, Grok acknowledged serious shortcomings in its safety mechanisms and admitted that existing protections had failed to prevent the creation of prohibited material. The statement promised a review of internal procedures and pledged to strengthen moderation systems to prevent similar incidents in the future.

The apology, however, was met with skepticism online. Many users argued that an automated system cannot take responsibility or express remorse, insisting that accountability lies with the developers and executives who designed, approved, and deployed the technology. Critics said the response appeared to shift blame away from human decision-makers.

Elon Musk and xAI’s leadership were also targeted by critics who said the apology should have come directly from the company’s management. According to these voices, responsibility for defining ethical standards, technical limits, and moderation policies rests with those overseeing the platform, not the software itself.

The incident has intensified calls for stricter oversight of AI tools. Experts and users alike warned that without deep technical changes and robust filters, similar failures could occur again. Some have even suggested suspending the tool until stronger guarantees are in place.

More broadly, the controversy has reignited debate over legal and ethical responsibility in artificial intelligence, particularly regarding the duty of AI creators to ensure their systems do not generate illegal or harmful content, especially when child protection is involved.

