The messaging platform Telegram has announced that it will provide users' IP addresses and phone numbers to authorities who present search warrants or other valid legal requests.
The modification to its terms of service and privacy policy "should discourage criminals," said CEO Pavel Durov in a Telegram post on Monday.
He continued, “While 99.999% of Telegram users have nothing to do with crime, the 0.001% involved in illicit activities create a bad image for the entire platform, putting the interests of our almost billion users at risk.”
This decision represents a major shift for Mr. Durov, the Russian-born co-founder of Telegram, who was detained by French authorities last month at an airport near Paris.
A few days later, prosecutors charged him with facilitating criminal activity on the platform.
The accusations against him include aiding in the distribution of child abuse images and drug trafficking. He was also charged with failing to cooperate with law enforcement.
Mr. Durov, who has denied the allegations, criticized the authorities after his arrest, stating that holding him accountable for third-party crimes on the platform was "surprising" and "misguided."
Critics argue that Telegram has become a hub for misinformation, child pornography, and terrorist-related content, in part due to a feature allowing groups to host up to 200,000 members.
In contrast, Meta-owned WhatsApp restricts group sizes to 1,000.
Earlier this week, Ukraine banned the app on government-issued devices in an effort to reduce security threats from Russia.
The arrest of the 39-year-old CEO has ignited debate about the future of free speech protections on the internet.
Following Mr. Durov’s detention, many began to question whether Telegram was a safe platform for political dissidents, according to John Scott-Railton, a senior researcher at the University of Toronto's Citizen Lab.
He noted that this latest policy shift is causing even greater concern in several communities.
Cybersecurity specialists point out that while Telegram has removed certain groups in the past, its system for moderating extremist and illegal content is significantly weaker than that of other social media companies and messaging apps.
Before this policy update, Telegram only provided information on terror suspects, as reported by 404 Media.
On Monday, Mr. Durov stated that the app now has “a dedicated team of moderators” utilizing artificial intelligence to hide problematic content from search results.
However, making such content harder to access may not be sufficient to meet French or European legal requirements, said Daphne Keller of Stanford University's Center for Internet and Society.
In some countries, platforms are also required to report specific types of illegal content, such as child sexual abuse material, she explained.
Ms. Keller raised doubts about whether the company's recent changes would satisfy authorities seeking details about the individuals under investigation, including who they are communicating with and the contents of their messages.