YouTube says it took down more than 8 MILLION videos in 3 months

YouTube has taken down more than eight million videos in three months for violating community guidelines as the true scale of offensive material posted on the site is revealed.

Included in the deleted videos was footage of terrorism, child abuse and hate speech content – 80 per cent of which were flagged by machines. 

The Google-owned company said the majority of deleted videos were ‘spam or people attempting to upload adult content’.

Just last week it was revealed YouTube was still airing adverts on extremist videos promoting Nazism and paedophilia. 

At the end of last year the Google-owned company announced it would be hiring 10,000 people to better monitor content in 2018.  

YouTube has taken down more than eight million videos in three months for violating community guidelines as the true scale of offensive material is revealed (stock image)

The information was included in the company’s first quarterly report on its performance between October and December last year.

According to the report, 6.7 million of the videos were flagged by machines and not humans.

Out of those, 76 per cent were removed before receiving any views from users.

‘Machines are allowing us to flag content for review at scale, helping us remove millions of violative videos before they are ever viewed’, the company wrote in a blog post.

‘At the beginning of 2017, 8 per cent of the videos flagged and removed for violent extremism were taken down with fewer than 10 views.

‘We introduced machine learning flagging in June 2017. Now more than half of the videos we remove for violent extremism have fewer than 10 views’, the company wrote.

YouTube has faced multiple complaints from advertisers who claim the company is failing to take down offensive content on the site.

Between October and December last year, 28.7 million videos were reported by YouTube users.

Nine million of those videos (30 per cent) were said to contain sexually explicit content and eight million (25 per cent) were said to be spam or misleading. 

More than half of the videos removed for violent extremism have fewer than 10 views (pictured). At the beginning of 2017, 8 per cent of the videos flagged and removed for violent extremism were taken down with fewer than 10 views


The company has had several issues over its content in recent years.  

Last week it was revealed more than 300 companies and organisations ran adverts on YouTube channels promoting offensive content.

Many of the companies – which included Adidas, Amazon and Netflix – said they were unaware their adverts had been placed on these channels and would be investigating. 

According to an investigation by CNN, companies such as Adidas, Amazon, Cisco, Facebook, Hershey, Hilton, LinkedIn, Mozilla, Netflix, Nordstrom and Under Armour may have unknowingly helped finance these channels.

It seems public money also went to fund this offensive content, including adverts from the US Department of Transportation and the Centers for Disease Control.

The site has more than a billion users who watch one billion hours of footage each day.

Advertisers have to trust YouTube to decide what videos are appropriate for their adverts.

WHAT HAS YOUTUBE DONE TO IMPROVE ITS MODERATION?

At the end of last year the Google-owned company announced it would be hiring 10,000 people to better monitor content in 2018 amid concerns offensive content was still making it on the site.  

Susan Wojcicki, the chief executive of the video sharing site, revealed that since June last year YouTube enforcement teams have reviewed two million videos for extremist content – removing 150,000 from the site.

Around 98 per cent of the videos that were removed were initially flagged by machine-learning algorithms.

Almost half were deleted within two hours of being uploaded, and 70 per cent were taken down within eight hours. 

Miss Wojcicki added: ‘Our goal is to stay one step ahead, making it harder for policy-violating content to surface or remain on YouTube.

‘We will use our cutting-edge machine learning more widely to allow us to quickly remove content that violates our guidelines.’ 

Earlier this year, YouTube’s parent company Google announced that from February 20, channels would need 1,000 subscribers and 4,000 hours of watch time racked up over the previous 12 months, regardless of total views, to qualify.

Previously, channels with 10,000 total views qualified for the YouTube Partner Program which allows creators to collect some income from the adverts placed before their videos. 

This threshold means a creator making a weekly ten-minute video would need 1,000 subscribers and an average of 462 views per video to start receiving ad revenue. 
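That 462 figure appears to follow from straightforward arithmetic, assuming every view counts as a complete watch of the full ten minutes:

4,000 hours of watch time = 240,000 minutes
52 weekly videos x 10 minutes = 520 minutes of footage a year
240,000 ÷ 520 ≈ 462 full views per video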

This is the biggest change to advertising rules on the site since its inception – and is another attempt to prevent the platform being ‘co-opted by bad actors’ after persistent complaints from advertisers over the past twelve months. 


American clothes company Under Armour is pausing advertising after its commercials appeared next to a YouTube channel called ‘Wife With A Purpose’ which promoted white nationalism.

‘We have strong values-led guidelines in place and are working with YouTube to understand how this could have slipped through the guardrails’, a company spokesperson said.

‘We take these matters very seriously and are working to rectify this immediately,’ the spokesperson said.

Adverts for Mozilla and 20th Century Fox Film were placed on a Nazi YouTube channel. YouTube later deleted the channel for violating community guidelines.

However, when contacted for comment, Brian Ruhe, who ran the channel, said he did not want to be referred to as a ‘neo-Nazi’, telling CNN he was a ‘real, genuine and sincere Nazi’. 

‘We have partnered with our advertisers to make significant changes to how we approach monetisation on YouTube with stricter policies, better controls and greater transparency’, a YouTube spokesperson told MailOnline.  

‘When we find that ads mistakenly ran against content that doesn’t comply with our policies, we immediately remove those ads. 

‘We know that even when videos meet our advertiser friendly guidelines, not all videos will be appropriate for all brands. But we are committed to working with our advertisers and getting this right’, the spokesperson said.  

The incident has again raised concerns about whether brands that advertise on YouTube can safeguard their integrity. 

Adverts for Mozilla and 20th Century Fox Film were placed on a Nazi YouTube channel. YouTube later deleted the channel (pictured) for violating community guidelines


In November Lidl, Mars, Adidas, Cadbury maker Mondelez, Diageo and other big companies all pulled advertising from YouTube.

An investigation found the video sharing site was showing clips of scantily clad children alongside the ads of major brands.

One video of a pre-teenage girl in a nightie drew 6.5 million views.

An investigation by The Times found YouTube, a unit of Alphabet subsidiary Google, had allowed sexualised imagery of children to be easily searchable and had not lived up to promises to better monitor and police its services to protect children.

The investigation also found that adverts for BT, Deutsche Bank, eBay, Amazon and TalkTalk had appeared next to the inappropriate videos. 

At the end of last year YouTube said it had removed more than 50 user channels and had stopped running ads on more than 3.5 million videos since June.

WHAT’S THE CONTROVERSY OVER YOUTUBE’S CONTENT?

YouTube has been subject to various controversies since its creation in 2005. 

It has become one of Google’s fastest-growing operations in terms of sales by simplifying the process of distributing video online while putting in place few limits on content.

However, parents, regulators, advertisers and law enforcement have become increasingly concerned about the open nature of the service. 

They have contended that Google must do more to banish and restrict access to inappropriate videos, whether it be propaganda from religious extremists and Russia or comedy skits that appear to show children being forcibly drowned. 

Child exploitation and inappropriate content

By the end of last year YouTube said it had removed more than 50 user channels and had stopped running ads on more than 3.5 million videos since June.

In March last year, a disturbing Peppa Pig fake, found by journalist Laura June, showed a dentist with a huge syringe pulling out the character’s teeth as she screamed in distress.

Mrs June only realised the violent nature of the video as her three-year-old daughter watched it beside her.

Hundreds of these disturbing videos were found on YouTube by BBC Trending back in March.

By the end of last year YouTube said it had removed more than 50 user channels and had stopped running ads on more than 3.5 million videos since June. Among the deleted channels was the wildly popular Toy Freaks channel, featuring a single dad and his two daughters


All of these videos are easily accessed by children through YouTube’s search results or recommended videos. 

YouTube has been getting more stringent about deleting videos. One example is the wildly popular Toy Freaks channel, featuring a single dad and his two daughters, which was deleted last year.

Although it’s unclear what exact policy the channel violated, the videos showed the girls in unusual situations that often involved gross-out food play and simulated vomiting.

The channel invented the ‘bad baby’ genre, and some videos showed the girls pretending to urinate on each other or fishing pacifiers out of the toilet.

Adverts being shown next to inappropriate videos

There has been widespread criticism that adverts are being shown on some clips depicting child exploitation.

YouTube has now tightened its rules on who qualifies for posting money-making ads.

Previously, channels with 10,000 total views qualified for the YouTube Partner Program which allows creators to collect some income from the adverts placed before their videos.

But YouTube’s parent company Google announced that from February 20, channels would need 1,000 subscribers and 4,000 hours of watch time racked up over the previous 12 months, regardless of total views, to qualify.

This is the biggest change to advertising rules on the site since its inception – and is another attempt to prevent the platform being ‘co-opted by bad actors’ after persistent complaints from advertisers over the past twelve months.

In November last year Lidl, Mars, Adidas, Cadbury maker Mondelez, Diageo and other big companies all pulled advertising from YouTube.

An investigation found the video sharing site was showing clips of scantily clad children alongside the ads of major brands.

One video of a pre-teenage girl in a nightie drew 6.5 million views.

Issues with system for flagging inappropriate videos

Another investigation in November found YouTube’s system for reporting sexual comments had serious faults.

As a result, volunteer moderators have revealed there could be as many as 100,000 predatory accounts leaving inappropriate comments on videos.

Users fill in an online form to report accounts they find inappropriate.

Part of this process involves sending links to the specific videos or comments they are referring to.

Investigators identified 28 comments that obviously violated YouTube’s guidelines.

According to the BBC, some include the phone numbers of adults, or requests for videos to satisfy sexual fetishes.

The children in the videos appeared to be younger than 13, the minimum age for registering an account on YouTube.

 


