Social media companies have been heavily criticised by a committee of British politicians for failing to do enough to remove illegal and extremist material posted on their sites, and for not preventing it from appearing in the first place.
Platforms including Twitter, YouTube and Facebook have been criticised over their moderation policies after high-profile cases in which violent or abusive material was posted online and, in some cases, not removed even after the platforms were notified.
The committee's report said it had found repeated examples of extremist material, including posts from banned jihadist and neo-Nazi groups, not being removed even after it had been reported.
"Social media companies' failure to deal with illegal and dangerous material online is a disgrace," said Yvette Cooper, chairwoman of parliament's Home Affairs Select Committee.
"They have been asked repeatedly to come up with better systems to remove illegal material such as terrorist recruitment or online child abuse. Yet repeatedly they have failed to do so. It is shameful."
The committee said the government needed to strengthen the laws on publishing such material, and called on social media companies to pay the cost of policing online content and to publicly report details of their moderation efforts.
Responding to the report, the government said it expected to see early and effective action from social media companies to develop the tools needed to identify and remove "terrorist propaganda".
"We have made it very clear that we will not tolerate the internet being used as a place for terrorists to promote their vile views, or use social media platforms to weaponise the most vulnerable people in our communities," interior minister Amber Rudd said.