Tech firms have been urged to show that they are removing extremist content more rapidly or face legislation forcing them to do so, new EU guidelines reveal.
Google, YouTube, Facebook, Twitter and others have been given three months to clean up their acts and tackle terrorist material published on their sites.
The guidance, which is not legally binding, includes a call to remove such material within an hour of being notified of its existence.
Failure to comply could result in new laws being brought in to make it a mandatory requirement.
Pictured: masked figures march in a recruitment video for the banned far-Right group National Action, which has since been taken down from YouTube.
In its strongest call yet, the European Commission, based in Brussels, today recommended a range of new measures that online platforms should take to stop the proliferation of extremist content.
European governments have said that extremist content on the web has influenced lone-wolf attackers who have killed people in several European cities after being radicalised.
Several governments have increased pressure on social media companies to do more to remove illegal content.
This includes material related to groups like Islamic State as well as individual incitements to commit atrocities.
‘While several platforms have been removing more illegal content than ever before, we still need to react faster against terrorist propaganda and other illegal content which is a serious threat to our citizens’ security, safety and fundamental rights,’ digital commissioner Andrus Ansip said in a written statement.
The recommendation, which is non-binding but could be taken into account by European courts, sets guidelines on how companies should remove illegal content generally.
This ranges from copyright infringements to hate speech and advises a quicker reaction to extremist material.
The Commission said it would assess the need for legislation within three months for what it described as ‘terrorist content’, given the urgency of the issue.
For all other types of illegal content it will assess progress made within six months.
It also called on the technology sector, which is dominated by U.S. companies, to adopt proactive measures such as automated detection to rid their platforms of illegal content.
European Digital Rights, a civil rights group, described the Commission’s approach as putting internet giants in charge of censoring Europe.
Only legislation would ensure democratic scrutiny and judicial review, the group said.
‘The European Commission is pushing “voluntary” censorship to internet giants to avoid legislation that would be subject to democratic scrutiny and judicial challenge,’ said Joe McNamee, executive director of the group.
‘Today’s Recommendation institutionalises a role for Facebook and Google in regulating the free speech of Europeans.
‘The Commission needs to be smart and to finally start developing policy based on reliable data and not public relations spin,’ he added.
Luxury goods groups, meanwhile, welcomed the Commission’s move, saying action by online platforms is also necessary to fight the sale of counterfeit goods online.
‘Proactive measures coupled with good consumer information is the only way to effectively deal with illegal content online,’ said Toni Belloni, group managing director of LVMH.