
NSPCC Slams Instagram for Failing to Remove Harmful Content Online


The NSPCC has criticised Instagram for failing to remove harmful content from its platform.

The children’s charity said the fall in the amount of harmful content removed from the social media app was a ‘significant failure in corporate responsibility’.

Facebook, which owns the photo-sharing app, released data showing that Instagram removed almost 80% less graphic content between April and June 2020 than in the previous quarter.

The drop in the removal of content about self-harm and suicide was blamed on Covid restrictions, which meant that most of the app’s content moderators were housebound.

As moderators returned to work once the restrictions were lifted, the number of removals returned to pre-Covid levels.

The UK government has promised an Online Harms Bill that would create a new regulator and hold social media companies to account for the content on their platforms. However, the bill has been severely delayed and will not be introduced until next year.

Tara Hopkins, Instagram's head of public policy, said: "We want to do everything we can to keep people safe on Instagram and we can report that from July to September we took action on 1.3m pieces of suicide and self-harm content, over 95% of which we found proactively.

"We've been clear about the impact of Covid-19 on our content-review capacity, so we're encouraged that these latest numbers show we're now taking action on even more content, thanks to improvements in our technology.

"We're continuing to work with experts to improve our policies and we are in discussions with regulators and governments about how we can bring full use of our technology to the UK and EU so we can proactively find and remove more harmful suicide and self-harm posts.”

However, the NSPCC claims that the drop in the number of takedowns had ‘exposed young users to even greater risk of avoidable harm during the pandemic’.

The charity's head of child safety online policy, Andy Burrows, said: "Sadly, young people who needed protection from damaging content were let down by Instagram's steep reduction in takedowns of harmful suicide and self-harm posts.

"Although Instagram's performance is returning to pre-pandemic levels, young people continue to be exposed to unacceptable levels of harm.

"The government has a chance to fix this by ensuring the Online Harms Bill gives a regulator the tools and sanctions necessary to hold big tech to account.”

Harry Pererra

Harry draws on his experience in journalism and programming to write about the latest news in the world of tech and the environment. When he isn’t writing for usave he is working towards his Blue Belt in Brazilian Jiu Jitsu, and prefers dogs to cats.
