
Grok issues apology after serious content moderation failure

Saturday 03 January 2026 - 15:00
By: Sahili Aya

Grok, the artificial intelligence tool developed by Elon Musk’s company xAI, has issued a public apology following a major controversy linked to the circulation of illegal content generated by its image system. The incident sparked widespread criticism and renewed concerns over the safeguards used in generative AI technologies.

In a statement released on its platform, Grok acknowledged serious shortcomings in its safety mechanisms and admitted that existing protections had failed to prevent the creation of prohibited material. The company promised a review of its internal procedures and pledged to strengthen moderation systems to avoid similar incidents in the future.

The apology, however, was met with skepticism online. Many users argued that an automated system cannot take responsibility or express remorse, insisting that accountability lies with the developers and executives who designed, approved, and deployed the technology. Critics said the response appeared to shift blame away from human decision-makers.

Elon Musk and xAI’s leadership were also targeted by critics who said the apology should have come directly from the company’s management. According to these voices, responsibility for defining ethical standards, technical limits, and moderation policies rests with those overseeing the platform, not the software itself.

The incident has intensified calls for stricter oversight of AI tools. Experts and users alike warned that without deep technical changes and robust filters, similar failures could occur again. Some have even suggested suspending the tool until stronger guarantees are in place.

More broadly, the controversy has reignited debate over legal and ethical responsibility in artificial intelligence, particularly regarding the duty of AI creators to ensure their systems do not generate illegal or harmful content, especially when child protection is involved.
