Platforms such as Twitter, YouTube and Facebook have been criticised over their moderation policies after high-profile cases in which violent or abusive material was posted online and, in some cases, not removed even after the companies were notified.
The committee's report said it had found repeated examples of extremist material, including posts from banned militant and neo-Nazi groups, not being removed even after it had been reported.
"Social media companies’ failure to deal with illegal and dangerous material online is a disgrace," said Yvette Cooper, chairperson of parliament's Home Affairs Select Committee. "They have been asked repeatedly to come up with better systems to remove illegal material such as militant recruitment or online child abuse. Yet repeatedly they have failed to do so. It is shameful."
The committee said the government needed to strengthen the laws on publishing such material, and called on social media companies to cover the cost of policing online content and to publicly report details of their moderation. Responding to the report, the government said it expected to see early and effective action from social media companies to develop the tools needed to identify and remove "terrorist propaganda."
"We have made it very clear that we will not tolerate the internet being used as a place for militants to promote their vile views, or use social media platforms to weaponise the most vulnerable people in our communities," interior minister Amber Rudd said.