95% of self-harm posts are not flagged by major platforms

Broadcast United News Desk
New research published by the UK’s Molly Rose Foundation (MRF) shows that major social networks are failing to detect and remove dangerous suicide and self-harm content, putting children at risk.

An analysis of more than 12 million content moderation decisions from six major tech platforms showed that just two of them, Pinterest and TikTok, accounted for more than 95% of the suicide and self-harm posts detected.

The report found that Instagram and Facebook each accounted for just 1% of all suicide and self-harm content detected across the platforms, while X (formerly Twitter) was responsible for just 1 in 700 content decisions.

MRF chair Ian Russell urged the UK government to commit to a new Online Safety Bill to strengthen regulation and “finish the job”.

The MRF analysed public records of more than 12 million content moderation decisions made by six platforms: Facebook, Instagram, Pinterest, Snapchat, TikTok and X. Under the EU’s Digital Services Act, these platforms must publish a record every time they detect suicide and self-harm content and take action against it.

Social media platforms often fail to detect harmful content in the riskiest parts of their services. For example, even though Instagram’s short-form video product Reels accounts for half of all time spent on the app, only 1 in 50 suicide and self-harm posts detected by Instagram were videos.

Enforcing their own rules

The report said most major services did not do enough to enforce their own rules: for example, while TikTok detected nearly 3 million pieces of suicide and self-harm content (2,892,658 decisions), it suspended only two accounts.

The foundation also found no evidence that Meta is following through on its high-profile commitment to stop harmful suicide and self-harm content being pushed to children. Despite promising in January 2024, shortly before Mark Zuckerberg gave evidence to a US Senate hearing, to restrict this material, Meta has so far not restricted any of this content for teenage users.

Ian Russell, chair of the Molly Rose Foundation, said: “It is shocking that most big tech companies continue to sit on the sidelines and choose not to take action to save young lives.”
