YouTube to expand teams reviewing extremist content

YouTube has promised to enlist 10,000 staff in a bid to root out violent extremism and dangerous content.

Susan Wojcicki, the chief executive of the video sharing site which is owned by Google, admitted people are exploiting the website to ‘mislead, manipulate, harass or even harm’ other users, including children.

She revealed that since June, YouTube enforcement teams have reviewed two million videos for extremist content – removing 150,000 from the site.

 


The Daily Mail has repeatedly called on social media sites to take down extremist content and videos that exploit young children following five terror attacks in the UK this year.

Last month it emerged that Google, which owns YouTube, has raked in millions of pounds in advertising revenue from the clips.

Writing in The Daily Telegraph, Miss Wojcicki said that YouTube has already developed technology that can identify videos containing extremist content.

The ‘computer-learning’ technology will now be used to track videos that spread hate speech or pose a risk to children.

She said: ‘We will continue the significant growth of our teams, with the goal of bringing the total number of people across Google working to address content that might violate our policies to over 10,000 in 2018.’

Around 98 per cent of videos that were removed were initially flagged by the ‘computer-learning’ algorithms. 

Almost half were deleted within two hours of being uploaded, and 70 per cent were taken down within eight hours.

Videos that were removed include posts by jihadists sympathetic to terrorist groups and bomb-making instructions.

It is estimated that about half a million hours of content are posted on the site every day, and an official report published today by David Anderson QC, the former independent reviewer of terrorism legislation, is expected to highlight the huge task faced by MI5 to monitor those who post online.


Miss Wojcicki added: ‘Our goal is to stay one step ahead, making it harder for policy-violating content to surface or remain on YouTube.

‘We will use our cutting-edge machine learning more widely to allow us to quickly remove content that violates our guidelines.’

At the end of last month it emerged YouTube’s system for reporting sexual comments on children’s videos has not been working for more than a year.

As a result, volunteer moderators have revealed there could be as many as 100,000 predatory accounts leaving inappropriate comments on videos.

Users report accounts they find inappropriate via an online form. 

Part of this process involves sending links to the specific videos or comments they are referring to.

An investigation by BBC Trending found that when the public submitted information on the form, associated links were sometimes missing.

Investigators identified 28 comments that obviously violated YouTube’s guidelines. 

According to the BBC, some included the phone numbers of adults, or requests for videos to satisfy sexual fetishes. 

The children in the videos appeared to be younger than 13, the minimum age for registering an account on YouTube. 

‘There are loads of things YouTube could be doing to reduce this sort of activity, fixing the reporting system to start with’, an anonymous flagger said.

‘But, for example, we can’t prevent predators from creating another account, and we have no indication when they do so that we can take action.’

BRANDS PULL ADVERTS FROM YOUTUBE

Following the investigation, several firms decided to pull their adverts from YouTube. 

Mars and Lidl were among the brands that pulled their adverts on Black Friday. 

A spokesperson for Mars said: ‘We are shocked and appalled to see that our adverts have appeared alongside such exploitative and inappropriate content.

‘It is in stark contrast to who we are and what we believe.

‘We have taken the decision to immediately suspend all our online advertising on YouTube and Google globally.

‘We have stringent guidelines and processes in place and are working with Google and our media buying agencies to understand what went wrong.

‘Until we have confidence that appropriate safeguards are in place, we will not advertise on YouTube and Google.’

And a spokesperson for Lidl said the firm was ‘shocked and disturbed’ by the investigation.  

The volunteer moderators estimated there were ‘between 50,000 to 100,000 active predatory accounts still on the platform’.

Some comments were sexually explicit and were posted on videos clearly made for children.

The report emerged on the same day that Lidl, Mars, Adidas, Cadbury maker Mondelez, Diageo and other big companies pulled advertising from YouTube.

An investigation found the video sharing site was showing clips of scantily clad children alongside the ads of major brands.

Comments from hundreds of paedophiles were posted alongside the images, which appeared to have been uploaded by the children themselves, according to an investigation.

One video of a pre-teenage girl in a nightie drew 6.5 million views. 

MailOnline has contacted YouTube for comment.