The head of Telegram, Pavel Durov, has been charged by the French judiciary with allegedly allowing criminal activity on the messaging app, but avoided jail by posting €5m bail.
The Russian-born multi-billionaire, who has French citizenship, was granted release on condition that he report to a police station twice a week and remain in France, Paris prosecutor Laure Beccuau said in a statement.
The charges against Durov include complicity in the spread of sexual images of children and a litany of other alleged violations on the messaging app.
His surprise arrest has put a spotlight on the criminal liability of Telegram, the popular app with around 1 billion users, and has sparked debate over free speech and government censorship.
Now do Zuckerberg and Musk
do you agree with this arrest or are you pointing out the double standards ?
I don’t really understand how he allowed crime. I can commit crime via SMS, WhatsApp, Signal or mail. Does that mean they allow it?
I think the distinction here is that if your phone provider, WhatsApp, Signal or mail carrier is informed that someone is engaging in illegal activity using their service, these entities would comply and hand over the information they have on you — be it a lot (like SMS) or a little (like Signal: phone number, registration date).
In the case of Telegram, they’ve been informed countless times that specific individuals are engaging in blatantly illegal activity, and unlike the previously mentioned entities, Telegram has refused to comply with any legal requests.
I believe that’s the situation but if I’m wrong, by all means correct me because this is a very interesting subject.
Thanks, this is the first explanation that’s actually clicked for me.
Don’t leave out Spez.
So to make it clear: it’s because their company actually holds client data that these governments want access to, since Telegram chats are not end-to-end encrypted unless you set a chat to secret.
In essence, Telegram as a company holds a lot of data that the French authorities want access to…
This comment brought to you by the Signal gang.
Nobody is coming after Signal because nobody uses Signal. Telegram has a user base of almost 1 billion.
Telegram has no end to end encryption by default.
Wording is confusing. Here are some better takes that sound valid and are true:
- Telegram’s e2ee is only available for chats of two people, and only on the official mobile client.
- Telegram’s e2ee is a feature you have to enable each time you need it (called secret chats).
and only on official mobile client.
This is incorrect; it is also available in other mobile clients (at least those that are forks of the official one).
- Telegram’s moderation leaves a lot to be desired. I’m not saying they should look into people’s private conversations or hand them to governments, but I am saying that certain public features of Telegram, which do let you report illegal material, have themselves been used to spread it.
certain public features of telegram that do allow you to report illegal materials have been used to spread them.
I don’t understand, what do you mean? Does clicking “report” on a message not simply send a report to moderators only?
I’m saying that Telegram’s moderators are not moderating content they should be moderating, and that Telegram has admitted it should be moderating. I know it’s not the moderators’ fault — the team is tiny compared to almost a billion monthly active users — but still.
I know that it’s not their fault, it’s the small size of the team
This part is directly Telegram’s fault. If they cannot keep up with their moderation queue, then they need a bigger moderation team, preferably a properly remunerated one. There are news reports about how Facebook’s sub-contracted moderators work for extremely shitty companies that track them by how many reviews per minute they do, which causes extreme psychological damage to the workers, both from the extreme content they have to view as part of their jobs and from the bad working conditions they must put up with.
Yes, basically every corporate social media site needs more moderators. A single person can barely moderate 200K users (the cohost example), so a platform with 900 million should probably have a trust and safety team far larger than the rumoured 30 or 60 (figures Durov didn’t confirm).
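A back-of-the-envelope sketch of what that ratio implies, using only the figures from this thread (one moderator per ~200K users from the cohost comparison, and ~900 million monthly active users for Telegram — both rough, unconfirmed numbers):

```python
# Rough moderator-headcount estimate from the figures cited in the thread.
USERS_PER_MODERATOR = 200_000      # assumption: the cohost "one person per 200K" ratio
TELEGRAM_MAU = 900_000_000         # ~900 million monthly active users

needed = TELEGRAM_MAU // USERS_PER_MODERATOR
print(f"Implied moderation team: ~{needed:,} people")        # ~4,500
print(f"Shortfall vs. a 30-60 person team: {needed // 60}x to {needed // 30}x")
```

By that (very crude) measure, a 30-to-60-person team would be two orders of magnitude smaller than the user base seems to demand.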