YouTube has been subject to various controversies since its creation in 2005.
It has become one of Google’s fastest-growing operations in terms of sales by simplifying the process of distributing video online but putting in place few limits on content.
However, parents, regulators, advertisers and law enforcement have become increasingly concerned about the open nature of the service.
They have contended that Google must do more to remove and restrict access to inappropriate videos, whether propaganda from religious extremists and Russian state actors or comedy skits that appear to show children being forcibly drowned.
Child exploitation and inappropriate content
By the end of last year, YouTube said it had removed more than 50 user channels and had stopped running ads on more than 3.5 million videos since June.
In March last year, journalist Laura June found a disturbing Peppa Pig fake in which a dentist with a huge syringe pulls out the character’s teeth as she screams in distress.
Ms June only realised the violent nature of the video as her three-year-old daughter watched it beside her.
Hundreds of these disturbing videos were found on YouTube by BBC Trending back in March.
All of these videos are easily accessed by children through YouTube’s search results or recommended videos.
YouTube has become more stringent about deleting videos. One example is the wildly popular Toy Freaks channel, featuring a single dad and his two daughters, which was deleted last year.
Although it’s unclear what exact policy the channel violated, the videos showed the girls in unusual situations that often involved gross-out food play and simulated vomiting.
The channel invented the ‘bad baby’ genre, and some videos showed the girls pretending to urinate on each other or fishing pacifiers out of the toilet.
Adverts being shown next to inappropriate videos
There has been widespread criticism that adverts are being shown on some clips depicting child exploitation.
YouTube has now tightened its rules on who qualifies for posting money-making ads.
Previously, channels with 10,000 total views qualified for the YouTube Partner Program which allows creators to collect some income from the adverts placed before their videos.
But YouTube’s parent company Google has announced that from February 20, channels will need 1,000 subscribers and 4,000 hours of watch time over the previous 12 months, regardless of total views, to qualify.
This is the biggest change to advertising rules on the site since its inception – and is another attempt to prevent the platform being ‘co-opted by bad actors’ after persistent complaints from advertisers over the past twelve months.
In November last year Lidl, Mars, Adidas, Cadbury maker Mondelez, Diageo and other big companies all pulled advertising from YouTube.
An investigation found the video sharing site was showing clips of scantily clad children alongside the ads of major brands.
One video of a pre-teenage girl in a nightie drew 6.5 million views.
Issues with system for flagging inappropriate videos
Another investigation in November found YouTube’s system for reporting sexual comments had serious faults.
Volunteer moderators revealed there could be as many as 100,000 predatory accounts leaving inappropriate comments on videos.
Users report accounts they find inappropriate via an online form, which involves sending links to the specific videos or comments in question.
Investigators identified 28 comments that obviously violated YouTube’s guidelines.
According to the BBC, some included the phone numbers of adults or requests for videos to satisfy sexual fetishes.
The children in the videos appeared to be younger than 13, the minimum age for registering an account on YouTube.