
Dear Reader,

It’s tempting to think of all social media platforms as the same. And they do have a lot of commonalities: They are pretty much all run by emotionally stunted billionaire tech bros; they are hotbeds of disinformation that impact almost every aspect of our lives; and they are so addictive we can’t seem to quit them no matter how bad they get. It’s easy, then, to assume that the arrest of Telegram founder and CEO Pavel Durov last week in Paris might be the start of a slippery slope against free speech on social media platforms more generally. But a closer look at both the unique features of Telegram and the particular kinds of speech at issue in Durov’s arrest suggests that other platforms are unlikely to be targeted in the same way – and perhaps that it wouldn’t necessarily be a bad thing if they were.

It’s likely that you’re not super familiar with Telegram, since it is much more popular outside of the United States than it is here. First, its origins: Telegram was founded in Russia by Durov and his brother Nikolai in 2013. The pair had previously founded Vkontakte, a Russian social media platform similar to Facebook, but Pavel was forced out in 2014 after a run-in with Russian authorities who wanted data on users who had been involved in the pro-democracy protests in Ukraine in 2013. The brothers based Telegram in Dubai, introducing a platform that combines messaging and social media features in novel ways.

Three main features make Telegram, which currently has almost a billion users, different from other social media platforms. The first is that, like other messaging apps, Telegram allows for group chats – except that it can accommodate “groups” of up to 200,000 people (by comparison, WhatsApp limits groups to 1,024 users). Telegram also offers public-facing “channels,” which can have an unlimited number of subscribers; a channel’s administrators can push whatever content they want to it. Unlike on other social media apps, however, subscribers cannot comment on or otherwise add their own input to content published through Telegram’s channels. Finally, users can opt in to “secret chats,” which provide end-to-end encryption similar to messaging apps like Signal – the content of these chats cannot be accessed even by Telegram itself.

These features, combined with the owners’ prior refusal to kowtow to Russian officials’ demands for data from Vkontakte, have made Telegram the champion of “cyber libertarians” worldwide. And the democratizing features of the platform should not be discounted. Telegram’s privacy makes it a preferred app for protesters and activists worldwide: It played a critical role in the 2019 protests in Hong Kong, the 2020 demonstrations in Belarus, and the 2017 and 2019 protests in Iran. Ukrainian President Volodymyr Zelensky has his own hugely popular Telegram channel, which allows him to effectively counter the Russian propaganda rampant on the site and is one of the few places where Russians can get independent information about the war.

So. Speaking of propaganda. As they say, one man’s freedom fighter is another man’s terrorist, so the same features that make Telegram a safe haven for people pushing back against oppressive, authoritarian regimes also make it a great hideout for some bad hombres. For example, Telegram is the “app of choice” for ISIS to disseminate its propaganda and recruit new members: It was used to recruit the perpetrators behind the ISIS attacks in Berlin and Istanbul. Telegram has also become a refuge for Twitter and Facebook users connected to the January 6 insurrection, who were kicked off of those platforms following the attack on the Capitol. As a result, Telegram saw an influx of 25 million users after those bans.

But the reason Durov has run afoul of French law isn’t propaganda or extremist content. Rather, French authorities allege that Durov is facilitating pretty straightforward criminal activity, like child sexual exploitation and drug trafficking. The platform is also rife with fraud, because users’ handles do not even need to be linked to a telephone number, eliminating the most basic traceable identifiers or verification and allowing individuals to masquerade as trusted authorities or companies. This is the result of a toxic combination: the almost complete lack of content moderation on Telegram and the platform’s refusal to cooperate with law enforcement under any circumstances. And because Telegram allows for unlimited forwarding of content – both in its massive groups and on its channels – criminal and exploitative content can spread rapidly at scale.

In this regard, Telegram seems to be even more dangerous than Twitter/X and Facebook (you might not have thought that was possible), since those platforms do, in general, try to remove criminal content and comply with authorized law enforcement investigations, especially when it comes to activity like terrorism and child sexual exploitation. And in any case, the slippery slope isn’t likely to happen in the U.S. any time soon: The owners of these platforms are protected from civil liability for unmoderated content on their products by Section 230 of the Communications Decency Act of 1996 – a buffer that suggests they are almost certainly insulated from direct criminal liability as well for taking a laissez-faire approach like Telegram’s. By contrast, European countries, including France, don’t provide such protections, allowing owners like Durov to be held criminally liable for “complicity in managing an online platform to allow illicit transactions by an organized group.” I mean, that sounds pretty crimey to me.

The arrest of Durov isn’t really a free speech issue, as some have argued, but rather a question of what kinds of accountability are appropriate for individuals who enable a vast criminal underworld that harms some of the most vulnerable people on the planet. Rather than looking at Durov’s arrest as a warning, perhaps we should see it as an example.

Stay Informed,

Asha