
Grok issues apology after serious content moderation failure

Saturday 03 January 2026 - 15:00
By: Sahili Aya
Grok, the artificial intelligence tool developed by Elon Musk's company xAI, has issued a public apology after illegal content generated by its image system circulated widely. The incident drew sharp criticism and renewed concerns about the safeguards built into generative AI systems.

In a statement posted on its platform, Grok acknowledged serious shortcomings in its safety mechanisms, admitting that existing protections had failed to prevent the creation of prohibited material. It promised a review of internal procedures and pledged to strengthen its moderation systems to prevent similar incidents.

The apology, however, was met with skepticism online. Many users argued that an automated system cannot take responsibility or express remorse, insisting that accountability lies with the developers and executives who designed, approved, and deployed the technology. Critics said the response appeared to shift blame away from human decision-makers.

Elon Musk and xAI’s leadership were also targeted by critics who said the apology should have come directly from the company’s management. According to these voices, responsibility for defining ethical standards, technical limits, and moderation policies rests with those overseeing the platform, not the software itself.

The incident has intensified calls for stricter oversight of AI tools. Experts and users alike warned that without deep technical changes and robust filters, similar failures could occur again. Some have even suggested suspending the tool until stronger guarantees are in place.

More broadly, the controversy has reignited debate over legal and ethical responsibility in artificial intelligence, particularly regarding the duty of AI creators to ensure their systems do not generate illegal or harmful content, especially when child protection is involved.

