
Telegram's Data Sharing: A Shift in Privacy Policy Raises Concerns

Tuesday 24 September 2024 - 12:16

In a move that has sent shockwaves through the digital world, Telegram, the popular messaging app, has announced a significant change to its data-sharing practices. The company, long known for its emphasis on user privacy, has revealed that it will now provide user IP addresses and phone numbers to authorities upon request, a decision that has sparked debate among its user base.

Telegram's CEO, Pavel Durov, justified this shift by stating that it would help deter criminal activities. He emphasized the need to protect the vast majority of law-abiding users, highlighting that a small fraction engaged in illicit activities could jeopardize the platform's reputation and the interests of its nearly one billion users.

This announcement marks a notable departure from Telegram's previous stance, especially considering Durov's recent detention by French authorities. The co-founder, born in Russia, was charged with enabling criminal activity on the platform, including the spread of child abuse images and drug trafficking. He vehemently denied these charges, criticizing the notion of being held responsible for third-party crimes on Telegram.

Critics have long argued that Telegram has become a haven for misinformation, child pornography, and terror-related content. The platform's group feature, allowing up to 200,000 members, has been cited as a contributing factor. In contrast, Meta's WhatsApp limits group sizes to 1,000.

Telegram has faced scrutiny for hosting far-right channels linked to violence in English cities and has recently been banned in Ukraine on state-issued devices due to concerns about Russian threats.

The arrest of Pavel Durov has ignited a broader discussion on free speech protections online. Many users, especially political dissidents, had viewed Telegram as a safe haven, attracted by its resistance to government demands. However, this latest policy change has caused alarm, leading to questions about Telegram's future cooperation with repressive regimes.

John Scott-Railton, a senior researcher at the University of Toronto's Citizen Lab, noted that many communities are now scrutinizing Telegram's announcement with skepticism. He highlighted the lack of clarity from Telegram on how it will handle demands from such regimes in the future.

Cybersecurity experts have pointed out that Telegram's moderation of extremist and illegal content is weaker than that of competing social media platforms and messaging apps. Before this policy expansion, Telegram shared user data only in cases involving terror suspects.

On the day of the announcement, Durov revealed that the app now employs a dedicated team of moderators using artificial intelligence to conceal problematic content in search results. However, experts like Daphne Keller from Stanford University's Center for Internet and Society argue that this may not be sufficient to meet legal requirements in France and Europe.

Keller emphasized the need for Telegram to remove any content that its employees can reasonably identify as illegal, and in certain countries, the company is also required to notify authorities about specific types of severely illegal content, such as child sexual abuse material.

The changes have left many questioning whether Telegram will satisfy law enforcement authorities' need for information, particularly when it comes to identifying investigation targets and accessing the content of their communications.

