
Grok issues apology after serious content moderation failure

Saturday 03 January 2026 - 15:00
By: Sahili Aya

Grok, the artificial intelligence tool developed by Elon Musk’s company xAI, has issued a public apology following a major controversy over the circulation of illegal content generated by its image-generation system. The incident sparked widespread criticism and renewed concerns over the safeguards built into generative AI technologies.

In a statement released on its platform, Grok acknowledged serious shortcomings in its safety mechanisms and admitted that existing protections had failed to prevent the creation of prohibited material. The company promised a review of its internal procedures and pledged to strengthen its moderation systems to prevent similar incidents in the future.

The apology, however, was met with skepticism online. Many users argued that an automated system cannot take responsibility or express remorse, insisting that accountability lies with the developers and executives who designed, approved, and deployed the technology. Critics said the response appeared to shift blame away from human decision-makers.

Elon Musk and xAI’s leadership were also targeted by critics who said the apology should have come directly from the company’s management. According to these voices, responsibility for defining ethical standards, technical limits, and moderation policies rests with those overseeing the platform, not the software itself.

The incident has intensified calls for stricter oversight of AI tools. Experts and users alike warned that without deep technical changes and robust filters, similar failures could occur again. Some have even suggested suspending the tool until stronger guarantees are in place.

More broadly, the controversy has reignited debate over legal and ethical responsibility in artificial intelligence, particularly regarding the duty of AI creators to ensure their systems do not generate illegal or harmful content, especially when child protection is involved.

