
Grok issues apology after serious content moderation failure

Saturday 03 January 2026 - 15:00
By: Sahili Aya

Grok, the artificial intelligence tool developed by Elon Musk's company xAI, has issued a public apology following a major controversy over illegal content generated by its image system. The incident drew widespread criticism and renewed concerns about the safeguards built into generative AI technologies.

In a statement released on its platform, Grok acknowledged serious shortcomings in its safety mechanisms and admitted that existing protections had failed to prevent the creation of prohibited material. The company promised a review of its internal procedures and pledged to strengthen moderation systems to avoid similar incidents in the future.

The apology, however, was met with skepticism online. Many users argued that an automated system cannot take responsibility or express remorse, insisting that accountability lies with the developers and executives who designed, approved, and deployed the technology. Critics said the response appeared to shift blame away from human decision-makers.

Elon Musk and xAI’s leadership were also targeted by critics who said the apology should have come directly from the company’s management. According to these voices, responsibility for defining ethical standards, technical limits, and moderation policies rests with those overseeing the platform, not the software itself.

The incident has intensified calls for stricter oversight of AI tools. Experts and users alike warned that without deep technical changes and robust filters, similar failures could occur again. Some have even suggested suspending the tool until stronger guarantees are in place.

More broadly, the controversy has reignited debate over legal and ethical responsibility in artificial intelligence, particularly regarding the duty of AI creators to ensure their systems do not generate illegal or harmful content, especially when child protection is involved.

