Information about Telegram CEO Pavel Durov’s unexpected detention in France

Broadcast United News Desk

Pavel Durov, the CEO and founder of messaging app Telegram, was detained in Paris on Saturday as part of an ongoing financial and cybercrime investigation in France. On Monday, French officials said he remained in custody, though he has not been charged with any crime.

French President Emmanuel Macron has denied that the arrest was politically motivated. Durov holds French and Emirati citizenship but is originally from Russia; France has been strongly critical of Russia’s invasion of Ukraine and has imposed sanctions on its economy.

The exact reason for Durov’s arrest is still unknown, but according to French prosecutors, it is part of a larger investigation. The New York Times reported that prosecutors said they were investigating an “unnamed individual” who they believed may have committed an extensive list of crimes — apparently with the help of Telegram — including the distribution of child sexual abuse material, money laundering, and drug trafficking. The Washington Post reported that French police said “child sex crimes” were an area of particular concern to officials.

It is not clear whether Durov has a lawyer. Unless formally charged, he can only be detained until Wednesday.

This is not the first time Telegram has been implicated in illegal activity. The globally popular platform offers both broadcast channels (where users can send text and media to large groups of people) and user-to-user chat. It also offers what it calls “Secret Chats”: end-to-end encrypted conversations, meaning that messages can only be deciphered by the participants in the conversation, and no one else (not even Telegram) can see the content.
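To make the end-to-end idea concrete, here is a minimal sketch in Python using the cryptography library. It is not Telegram’s actual MTProto protocol (the key exchange, the participants, and the message are all illustrative assumptions), but it shows why a server that only relays ciphertext cannot read the message.

```python
# pip install cryptography
import os
from cryptography.hazmat.primitives.asymmetric.x25519 import X25519PrivateKey
from cryptography.hazmat.primitives.kdf.hkdf import HKDF
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

# Each participant generates a key pair; only the public keys ever leave the device.
alice_private = X25519PrivateKey.generate()
bob_private = X25519PrivateKey.generate()

# Both sides derive the same shared secret from their own private key
# and the other party's public key (Diffie-Hellman key exchange).
alice_shared = alice_private.exchange(bob_private.public_key())
bob_shared = bob_private.exchange(alice_private.public_key())
assert alice_shared == bob_shared

# Derive a symmetric key from the shared secret.
key = HKDF(algorithm=hashes.SHA256(), length=32, salt=None, info=b"chat").derive(alice_shared)

# Alice encrypts; only someone holding the derived key (Alice or Bob) can decrypt.
nonce = os.urandom(12)
ciphertext = AESGCM(key).encrypt(nonce, b"meet at noon", None)

# A server relaying `ciphertext` cannot read it without the key;
# Bob, who holds the same derived key, can.
plaintext = AESGCM(key).decrypt(nonce, ciphertext, None)
assert plaintext == b"meet at noon"
```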

This feature, along with other privacy features such as automatic message deletion, makes the app useful for political dissidents and journalists trying to work under repressive regimes or to protect their sources. But over the years, the app has also become a tool for extremists to incite followers and organize terrorist attacks.

This has led to pressure from governments for Telegram to be more cooperative in sharing data with authorities. Despite this, Telegram has largely avoided dramatic legal run-ins — until now.

Durov’s arrest has renewed scrutiny of the app and reignited debate about free speech and the challenges of content moderation on social media.

Telegram and the problem of content moderation

Durov and his brother Nikolay founded Telegram to offer an app centered on user privacy in response to Russia’s “Snow Revolution” of 2011 and 2012, when blatant electoral fraud sparked months of protests that culminated in a harsh government crackdown. Durov had previously clashed with Russian authorities who wanted to suppress speech on VKontakte, the Facebook-like service he had founded.

Since its inception, Telegram has been implicated in some egregious crimes. Perhaps most notoriously, it was used to coordinate ISIS attacks in Paris and Berlin. Telegram cracked down on ISIS activity on the app following those attacks, but its content moderation policies have remained under scrutiny.

As Vox has pointed out, these policies are more lenient than those of other social media companies, and The Washington Post has reported on all kinds of criminal content on Telegram, including child pornography. Alessandro Accorsi, a researcher at the International Crisis Group, told Vox that preventing such content from appearing on the platform is a difficult task, but not an impossible one.

“The effectiveness of content moderation depends heavily on the platforms and the resources they allocate for safety,” Accorsi said. “Social media companies are typically reactive. They want to limit the financial resources they devote to moderation, as well as the possible legal, political, and ethical issues. So it’s often the case that they focus their efforts on a small number of groups or issues where inaction would bring them legal or reputational damage.”

For example, when ISIS used a service to coordinate terrorist attacks, that service’s focus became preventing ISIS from using its products.

For communications without end-to-end encryption, tech companies use a combination of human reviewers and algorithm-driven programs to police content. The end-to-end encryption used in Telegram’s “secret chats,” however, makes such auditing nearly impossible.
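As a rough illustration of the algorithm-driven side, many services compare uploaded files against fingerprints of known illegal material. The sketch below uses plain SHA-256 hashes and a made-up hash value purely for illustration (production systems rely on perceptual hashes such as PhotoDNA); the point is that the check only works when the service can see the plaintext.

```python
import hashlib

# Hypothetical database of fingerprints of known prohibited files.
# The value below is invented for illustration only.
KNOWN_BAD_HASHES = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}

def flag_if_known(file_bytes: bytes) -> bool:
    """Return True if the uploaded file matches a known prohibited fingerprint."""
    digest = hashlib.sha256(file_bytes).hexdigest()
    return digest in KNOWN_BAD_HASHES

# This check requires access to the plaintext bytes. If a client encrypts the
# file end to end before upload, the server only ever sees ciphertext, and the
# comparison above can no longer be performed server-side.
```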

Another complication is the diversity of internet laws around the world. In the United States, platforms generally enjoy legal immunity from liability for what their users post. But this is not universally the case; many countries have stricter legal frameworks around intermediary liability. France’s SREN law is particularly strict, and fines can be imposed on platforms hosting content that violates its rules.

“That’s really hard to do, especially in a comparative context, because hateful, extreme or radical speech in places like the United States is different than in Myanmar or Bangladesh or other countries,” David Muchlinski, a Georgia Tech professor of international affairs, told Vox. This makes content moderation “a clumsy tool at best,” he said.

Telegram has responded to recent external pressure by employing some content moderation measures, Accorsi told Vox. The company has banned channels associated with a small number of organizations (most recently Hamas and British far-right groups), but thousands of problematic groups remain on the platform.

The French investigation suggests that Telegram may not have done enough to prevent bad actors from using the platform to commit crimes.




