
Grok issues apology after serious content moderation failure

Saturday 03 January 2026 - 15:00
By: Sahili Aya

Grok, the artificial intelligence tool developed by Elon Musk’s company xAI, has issued a public apology following a major controversy linked to the circulation of illegal content generated by its image system. The incident sparked widespread criticism and renewed concerns over the safeguards used in generative AI technologies.

In a statement released on its platform, Grok acknowledged serious shortcomings in its safety mechanisms and admitted that existing protections had failed to prevent the creation of prohibited material. The company promised a review of its internal procedures and pledged to strengthen moderation systems to avoid similar incidents in the future.

The apology, however, was met with skepticism online. Many users argued that an automated system cannot meaningfully take responsibility or express remorse, and insisted that accountability lies with the developers and executives who designed, approved, and deployed the technology. Critics said the response appeared to shift blame away from human decision-makers.

Elon Musk and xAI’s leadership were also targeted by critics who said the apology should have come directly from the company’s management. According to these voices, responsibility for defining ethical standards, technical limits, and moderation policies rests with those overseeing the platform, not the software itself.

The incident has intensified calls for stricter oversight of AI tools. Experts and users alike warned that without deep technical changes and robust filters, similar failures could occur again. Some have even suggested suspending the tool until stronger guarantees are in place.

More broadly, the controversy has reignited debate over legal and ethical responsibility in artificial intelligence, particularly regarding the duty of AI creators to ensure their systems do not generate illegal or harmful content, especially when child protection is involved.



