YouTube is still failing to take down jihadi propaganda and missing its target for removing IS films

YouTube repeatedly fails to remove jihadist videos within two hours of them being posted because of ‘staggering’ holes in its monitoring, according to a study.

It found that the Google-owned video sharing site missed its target for taking down Islamic State films in one in four cases.

Dozens of terrorist propaganda and recruitment videos were left up for more than three days at a time, clocking up tens of thousands of views, according to the three-month study by the Counter Extremism Project (CEP).

Six in 10 of the IS supporters who posted the hate videos were not even banned from the site and their accounts remain active, it found.

The failings come after YouTube rejected an offer of free technology to instantly block any previously identified extremist content, preferring to develop its own system that it says deletes millions of banned videos before they are seen.

At the G7 summit in October last year, YouTube joined with Facebook, Twitter and Microsoft in an accord aimed at removing extremist content from their platforms within two hours.

But in the first in-depth independent study of IS videos on YouTube, the CEP found this was not happening because of ‘inexcusable’ holes in the service’s monitoring system.

Researchers found 229 previously identified terror videos were uploaded 1,348 times and viewed on 163,000 occasions over three months from March 8 to June 8, with 24 per cent left on the site for more than two hours.

They included the film Caliphate 4 – uploaded six times during the trial period – in which a terrorist taunts former soldier Prince Harry, saying: ‘Why don’t you come here and fight us if you’re man enough, so we can send you and your Apaches to hellfire?’

Another video called Hunt Them O Monotheist was uploaded 12 times during the study and on one occasion allowed to remain up for 39 hours.

It urges Muslims to launch terror attacks in cities including London and Birmingham, with a voice-over saying: ‘Don’t hesitate to attack their concentrations even if there are children.’

Another, which was uploaded 17 times, was a bomb-making instruction video called You Must Fight Them O Monotheist, which was used by Manchester bomber Salman Abedi.

Computer scientist Dr Hany Farid, from Dartmouth College in the US, who developed a system that stops known child abuse films being uploaded, created a similar program that instantly identifies and removes terror videos.

YouTube, Facebook and Google were all offered the eGlyph system free by the CEP in 2016 but decided not to use it.

Dr Farid said it was ‘infuriating’ that companies worth billions refused to implement systems that could instantly stop jihadist videos. ‘Spectacular failures are allowing terror groups to continue to radicalise and recruit online,’ he added.

Former Tory minister Mark Simmonds, now a senior adviser to CEP, said: ‘This study dispels any lingering myth that YouTube are doing enough to stop their site being used as an IS recruitment tool.

‘The research shows that YouTube are not even meeting their own promise to delete all extremist content within two hours. For them to fail in a quarter of all cases, with much of the content still available three days or more after first being uploaded, is unacceptable.’

He added: ‘Even videos that stayed online for less than two hours received a total of nearly 15,000 hits – any one could become a potential terrorist.

‘It is staggering and inexcusable that well over half of the IS supporters who upload this dangerous content are not even banned and their accounts remain active … spreading IS propaganda and grooming potential recruits.’

Google said it ‘rejects terrorism and has a strong track record of taking swift action against terrorist content’.

A spokesman added: ‘We’ve invested heavily in people and technology to ensure we keep making progress to detect and remove terror content as quickly as possible.

‘We’re a founding member of the Global Internet Forum to Counter Terrorism, which sees tech companies collaborate to keep terror content off the web.’