Grok issues apology after serious content moderation failure

Saturday 03 January 2026 - 15:00
By: Sahili Aya
Grok, the artificial intelligence tool developed by Elon Musk's company xAI, has issued a public apology following a major controversy over illegal content generated by its image system. The incident drew widespread criticism and renewed concerns about the safeguards built into generative AI technologies.

In a statement released on its platform, Grok acknowledged serious shortcomings in its safety mechanisms and admitted that existing protections had failed to prevent the creation of prohibited material. The statement promised a review of internal procedures and pledged to strengthen moderation systems to prevent similar incidents in the future.

The apology, however, was met with skepticism online. Many users argued that an automated system cannot take responsibility or express remorse, insisting that accountability lies with the developers and executives who designed, approved, and deployed the technology. Critics said the response appeared to shift blame away from human decision-makers.

Elon Musk and xAI’s leadership were also targeted by critics who said the apology should have come directly from the company’s management. According to these voices, responsibility for defining ethical standards, technical limits, and moderation policies rests with those overseeing the platform, not the software itself.

The incident has intensified calls for stricter oversight of AI tools. Experts and users alike warned that without deep technical changes and robust filters, similar failures could occur again. Some have even suggested suspending the tool until stronger guarantees are in place.

More broadly, the controversy has reignited debate over legal and ethical responsibility in artificial intelligence, particularly regarding the duty of AI creators to ensure their systems do not generate illegal or harmful content, especially when child protection is involved.

