YouTube tightens rules to protect advertisers

YouTube has agreed to manually review all videos in its ‘preferred’ section so advertisers are sure they are not promoting harmful videos. 

The move comes amid criticism that adverts are being shown on some clips depicting child exploitation.

As well as manual reviews, YouTube has tightened its rules on who qualifies for posting money-making ads.  

Previously, channels with 10,000 total views qualified for the YouTube Partner Program which allows creators to collect some income from the adverts placed before their videos.

But YouTube’s parent company Google has announced that from February 20, channels will need 1,000 subscribers and 4,000 hours of watch time over the previous 12 months, regardless of total views, to qualify.

This is the biggest change to advertising rules on the site since its inception – and is another attempt to prevent the platform being ‘co-opted by bad actors’ after persistent complaints from advertisers over the past twelve months.


YouTube is tightening its rules on who can make money from adverts to prevent the platform being ‘co-opted by bad actors’ (stock image)

BIG COMPANIES PULL ADVERTISING FROM YOUTUBE

In November last year Lidl, Mars, Adidas, Cadbury maker Mondelez, Diageo and other big companies all pulled advertising from YouTube.

An investigation found the video sharing site was showing clips of scantily clad children alongside the ads of major brands.

One video of a pre-teenage girl in a nightie drew 6.5 million views.

An investigation by The Times found YouTube, a unit of Alphabet subsidiary Google, had allowed sexualised imagery of children to be easily searchable and had not lived up to promises to better monitor and police its services to protect children. 

The investigation also found that adverts for BT, Deutsche Bank, eBay, Amazon and TalkTalk appeared next to the inappropriate videos.

According to the investigation there are dozens of affected brands in total, although it does not provide a comprehensive list. 

This comes just weeks after YouTuber Logan Paul’s video showing the body of a suicide victim reached the site’s trending page before being removed.

YouTube’s new threshold means a creator making a weekly ten-minute video would need 1,000 subscribers and an average of 462 views per video to start receiving ad revenue.
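The per-video figure quoted above can be checked with some simple arithmetic. This is a minimal sketch assuming a weekly upload schedule (52 videos over the 12-month window) and that every view watches the full ten-minute video:

```python
# Sanity check of the ~462-views-per-video figure.
# Assumptions: 52 weekly uploads per year, each 10 minutes long,
# and every view counts the full video length as watch time.
WATCH_HOURS_REQUIRED = 4_000
MINUTES_PER_VIDEO = 10
VIDEOS_PER_YEAR = 52

total_minutes_needed = WATCH_HOURS_REQUIRED * 60  # 240,000 minutes
views_per_video = total_minutes_needed / (MINUTES_PER_VIDEO * VIDEOS_PER_YEAR)
print(round(views_per_video))  # 462
```

In practice the real requirement would be higher, since average view duration is typically well below a video’s full length.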

The Government, Marks & Spencer and the BBC were amongst several advertisers that pulled ads from the platform last year after it was revealed their messages were appearing alongside extremist content.

YouTuber Felix Kjellberg, known as PewDiePie to his 59 million subscribers, seemed to agree with the new rules.

‘This is YouTube’s response to the Logan Paul thing,’ he said.

‘It shouldn’t be an issue to not monetise before you hit these numbers.’

He added that he did not earn ad revenue from the site until he had reached 25,000 subscribers. 

‘As much as I typically hate [YouTube’s] business practice, this one makes sense to me.’

He is no stranger to YouTube controversy, having had his own premium advertising links cut by Google last year after allegations of antisemitism in his videos.

But several responses to YouTube CEO Susan Wojcicki’s announcement accused her of ‘punishing small creators’ who are unlikely to reach the new thresholds.

One YouTuber, Beanie Draws, who has 13,000 subscribers, told her: ‘Punishing small creators even further while [sic] this action won’t stop people like Logan Paul.’

‘Delete his channel completely if you’re serious.’

YouTube will also change Google Preferred – its programme for advertisers allowing them to place ads on the site’s most popular clips.

The videos on this list will now be ‘manually reviewed’ to protect advertisers from unwittingly putting their ads on popular, but problematic videos.

YouTube is also introducing a ‘three-tier suitability system’ for brands to decide what kind of videos they want their ads to run on.

Earlier this month twenty-two-year-old Logan Paul garnered millions of views on his video filmed in the Aokigahara Forest, a well-known site for suicides, in Japan.

His vlogging channel has around 15 million subscribers, and the videos on there have been viewed more than three billion times since its creation in 2015.

The vlogger took the video down a day after posting it following widespread outrage and issued a grovelling apology.

‘I’m so sorry about this… suicide is not a joke. Depression and mental illnesses are not a joke,’ he said.

He claimed he had intended to raise awareness about suicide by filming the video.  

‘If this video saves just ONE life, it’ll be worth it,’ he said.

‘I was misguided by shock and awe, as portrayed in the video. I still am.’

The video showed Logan walking through the forest near Mount Fuji with two friends when they came across the corpse of a man hanging from a tree.

Paul zoomed his camera to focus on the man’s body before yelling out: ‘Yo, are you alive? Are you f**king with us?’


YOUTUBE’S REPORTING SYSTEM

An investigation in November found YouTube’s system for reporting sexual comments had serious faults.

As a result, volunteer moderators have revealed there could be as many as 100,000 predatory accounts leaving inappropriate comments on videos.

Users fill in an online form to report accounts they find inappropriate.

Part of this process involves sending links to the specific videos or comments they are referring to.

However, an investigation by BBC Trending found that when the public submitted information on the form, associated links were sometimes missing.

Investigators identified 28 comments that obviously violated YouTube’s guidelines.

According to the BBC, some include the phone numbers of adults, or requests for videos to satisfy sexual fetishes.

The children in the videos appeared to be younger than 13, the minimum age for registering an account on YouTube.

He went on to say that the man’s hands were purple, suggesting he had died only hours earlier.

YouTube has cancelled Logan Paul’s lucrative Google Preferred ad deal and limited the media star’s ability to earn money from the platform. 

As a result of the controversial video, the company is now using human moderators and AI to spot if preferred videos are appropriate for ads. 

The company’s ‘Preferred program’ allows companies to place ads exclusively on videos from the top five per cent of YouTube creators, whose content commands the highest prices with advertisers.




Read more at DailyMail.co.uk