Telegram Faces Criticism Over Handling of Child Exploitation Content

Wednesday 28 August 2024 - 14:25

Before Telegram CEO Pavel Durov's arrest in France, the messaging app had been criticized for its lack of response to child safety advocates. Advocacy groups such as the National Center for Missing & Exploited Children (NCMEC), the Canadian Centre for Child Protection, and the Internet Watch Foundation have reported that their efforts to alert Telegram about child sexual abuse material (CSAM) on the platform have largely gone unanswered.

Durov, who co-founded Telegram, was taken into custody by French authorities over the weekend. The Paris prosecutor has yet to detail specific charges but indicated that Durov’s arrest relates to an ongoing investigation involving serious allegations, including "complicity" in illegal transactions and the distribution of CSAM.

Telegram, which is popular in former Soviet countries and among certain U.S. far-right groups, said in a statement on X (formerly Twitter) that it adheres to European Union laws and that Durov has “nothing to hide.” It called it “absurd” to suggest that the platform or its owner is responsible for abuse occurring on it.

Despite these claims, Telegram has long been criticized for its approach to moderation and has a reputation for being uncooperative with law enforcement; Durov said in April that the platform had 900 million active users. John Shehan, senior vice president at NCMEC, expressed optimism about Durov’s arrest, pointing to Telegram’s notorious lack of content moderation regarding CSAM.

“Telegram stands out for its inadequate response to child sexual exploitation,” Shehan said. “It’s encouraging to see the French authorities taking action to address this issue.”

Telegram’s policy states that it does not act on reports of illegal activity in private or group chats and that it has never disclosed user data to third parties, including governments. This stance contrasts sharply with that of other major tech companies, which routinely comply with legal requests for user data.

In a statement, Telegram spokesperson Remi Vaughan did not address the specific claims about ignored reports but said that the platform actively moderates harmful content. Vaughan said Telegram uses a combination of AI tools, proactive monitoring, and user reports to manage content, and that it bans thousands of public groups daily for violating its terms of service.

The Stanford Internet Observatory's report on CSAM enforcement noted that Telegram’s privacy policy is unique in failing to explicitly prohibit CSAM or child grooming in private chats. Unlike U.S.-based platforms, which are legally required to work with NCMEC to quickly remove flagged abuse material, Telegram is headquartered in Dubai and operates under different regulations.

Other tech companies, including TikTok, Fenix, and Aylo, promptly remove CSAM flagged by NCMEC, but Telegram’s policy differs significantly. And while Telegram offers end-to-end encryption for private messages, it lacks in-app options to report illegal content, a feature available on platforms such as WhatsApp.

NCMEC has logged over 570,000 reports of CSAM on Telegram since the app’s inception in 2013. Despite ongoing outreach, the organization reports that Telegram remains largely unresponsive to their efforts. The Internet Watch Foundation and the Canadian Centre for Child Protection also report that Telegram has refused to implement measures to combat the spread of CSAM.

Stephen Sauer from the Canadian Centre for Child Protection noted an increase in CSAM on Telegram and criticized the platform’s opaque moderation practices. “Telegram’s approach is inadequate and lacks transparency,” Sauer said. “The platform’s inaction on this issue represents a deliberate choice to ignore the problem.”

As the case unfolds, Telegram’s response and any further action by international authorities will be closely watched.

