Telegram founder and CEO Pavel Durov was arrested in France in connection with an investigation into the messaging app’s moderation (or lack thereof).
Durov was arrested at Paris-Le Bourget airport on Saturday evening (local time). Reuters reports that he had just arrived on a private jet from Azerbaijan. AFP sources said a warrant for Durov’s arrest had been issued by OFMIN, a French law enforcement agency that focuses on combating violence against minors.
Telegram is reportedly under investigation for failing to curb criminal activity on its platform due to its lenient moderation policies, and is accused of failing to cooperate with French authorities investigating criminal activity that may include fraud, drug trafficking, child sexual abuse, promoting terrorism, organized crime, and cyberbullying.
The messaging app issued a statement defending Durov, saying it had not broken any laws and was currently awaiting a “speedy resolution to the situation.”
“Telegram complies with EU law, including the Digital Services Act, and the company’s moderation is within industry standards and is constantly being improved,” Telegram said in a post on X. “Telegram CEO Pavel Durov has nothing to hide and travels frequently to Europe. It is absurd to claim that the platform or its owners are responsible for misuse of its platform.”
French authorities can detain suspects for up to 96 hours, after which they must either release them or charge them. The Guardian reports that Durov’s detention has already been extended beyond Sunday night by the investigating judge.
Durov was born in Russia but holds multiple nationalities, including French citizenship, which appears to have caused friction between the two countries: Russian authorities say France has denied their embassy consular access to him.
Some Russian officials have gone further, accusing France of censorship. Russian Human Rights Commissioner Tatyana Moskalkova has reportedly claimed that the real motive for Durov’s arrest was “an attempt to shut down Telegram as an Internet resource where you can find out the truth about world events.” The messaging app is widely used by Russian authorities themselves.
Russia has its own history of feuding with Telegram: in 2017, the company refused to decrypt communications from six users suspected of “terrorist-related activity.” In response, Russia fined Telegram and blocked its use in the country. The ban was upheld by a Russian court and eventually lifted in 2020.
Social media users have begun calling for Durov’s release under the hashtag #FreePavel, which billionaire Elon Musk used to share a video of Durov praising Musk’s platform, X, for becoming “more pro-free speech.” Whistleblower Edward Snowden also condemned the arrest, calling it “a violation of the fundamental human rights of speech and association.”
Telegram’s lax moderation approach attracts criminals
Telegram has built a reputation as a privacy-focused messaging app, offering optional end-to-end encryption and vowing to “protect user data at all costs.” Unfortunately, this policy has also allowed misinformation, disinformation, and criminal activity to thrive on the app, giving rise to data leaks, revenge porn, forged documents, and Nazi extremism.
The company is well aware that its service is being used for criminal purposes. Despite this, Telegram seems to have little interest in preventing such activity. Its FAQ page addresses the issue directly, answering the question, “There’s illegal content on Telegram. How do I take it down?”
“All chats and group chats on Telegram are private between participants,” Telegram wrote. “We do not process any requests related to them.”
That’s not to say Telegram hasn’t dabbled in moderation: Shortly after the Jan. 6 attack on the US Capitol, the app blocked “dozens” of channels for inciting violence. A Telegram spokesperson told CNN at the time that the company “routinely removes public content that contains direct calls to violence.”
Yet Telegram has a much longer and more consistent history of taking a laissez-faire approach to content moderation.
The tension between law enforcement and privacy rights has no easy resolution. Governments around the world have tried to force tech companies to build backdoors into their encryption, arguing that law enforcement needs access to users’ chat logs. But privacy advocates counter that the technology simply doesn’t work that way: there is no way to give police a way past encryption without weakening security for everyone.