
Grok issues apology after serious content moderation failure

Saturday 03 January 2026 - 15:00
By: Sahili Aya
Grok, the artificial intelligence tool developed by Elon Musk’s company xAI, has issued a public apology following a major controversy over illegal content generated by its image system. The incident sparked widespread criticism and renewed concerns about the safeguards built into generative AI technologies.

In a statement released on its platform, Grok acknowledged serious shortcomings in its safety mechanisms, admitting that existing protections had failed to prevent the creation of prohibited material. The company promised a review of its internal procedures and pledged to strengthen its moderation systems to prevent similar incidents in the future.

The apology, however, was met with skepticism online. Many users argued that an automated system cannot take responsibility or express remorse, insisting that accountability lies with the developers and executives who designed, approved, and deployed the technology. Critics said the response appeared to shift blame away from human decision-makers.

Elon Musk and xAI’s leadership were also targeted by critics who said the apology should have come directly from the company’s management. According to these voices, responsibility for defining ethical standards, technical limits, and moderation policies rests with those overseeing the platform, not the software itself.

The incident has intensified calls for stricter oversight of AI tools. Experts and users alike warned that, without deep technical changes and robust content filters, similar failures could recur. Some have even suggested suspending the tool until stronger guarantees are in place.

More broadly, the controversy has reignited debate over legal and ethical responsibility in artificial intelligence, particularly regarding the duty of AI creators to ensure their systems do not generate illegal or harmful content, especially when child protection is involved.

