YouTube removes less than half of hate content reported

YouTube takes down less than half of the hate content reported to it, according to a shocking new report by a think tank.

The material that does get taken down – which includes Islamist extremist postings – is left up for almost two weeks before being removed, researchers found.

One of the videos that YouTube refused to remove showed a man filmed slapping a Muslim teenager with bacon and calling him ‘ISIS scum’.

The report is the latest of many to expose inappropriate content being posted and promoted on the Google-owned site.

HATE SPEECH

According to YouTube, hate speech refers to content that promotes hatred against individuals based on certain attributes, such as their ethnic origin, religion, disability, gender, age, veteran status or sexual orientation.

There is a fine line between what is and what is not considered hate speech.

For example, it is fine to criticise a nation but promoting hatred based on people’s ethnicity violates YouTube’s policy.

If users are unhappy about specific content and believe it violates community guidelines they can flag videos.

To do this on mobile, go to the video you wish to report, tap the three-dot menu at the top and then tap Report. You will be asked to select a reason for flagging that video.

On desktop, click ‘More’ under the video and select ‘Report’ from the drop-down menu.

If users find multiple hate videos or comments from a single YouTuber, they can submit a more detailed complaint through YouTube’s reporting tool.
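For organisations flagging material in bulk rather than one video at a time, the same report action can also be reached programmatically. The Python sketch below shows one way a reporting tool might submit a hate-speech flag, assuming the YouTube Data API v3's videoAbuseReportReasons.list and videos.reportAbuse endpoints, the google-api-python-client library, and an OAuth credential carrying the youtube.force-ssl scope; the credentials object, video ID and free-text note are placeholders.

# Minimal sketch of programmatic flagging (assumptions noted above).
from googleapiclient.discovery import build

def flag_video(credentials, video_id, note):
    # Build an authenticated client; the credential must include the youtube.force-ssl scope.
    youtube = build("youtube", "v3", credentials=credentials)

    # Fetch the valid abuse-report reasons and pick the one labelled as hate speech.
    reasons = youtube.videoAbuseReportReasons().list(part="snippet").execute()
    hate = next(r for r in reasons["items"]
                if "hate" in r["snippet"]["label"].lower())

    # Submit the flag; the programmatic equivalent of tapping 'Report' on the video.
    youtube.videos().reportAbuse(body={
        "videoId": video_id,
        "reasonId": hate["id"],
        "comments": note,  # free-text context for the human reviewer
    }).execute()

# Hypothetical usage:
# flag_video(creds, "VIDEO_ID_HERE", "Promotes hatred against a religious group")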

The three-month-long experiment was run by the London-based Henry Jackson Society think tank and involved researchers reporting alarming material weekly, The Sun reports.

Only 47 of the 107 Islamist extremist postings that were reported were removed, and those that were taken down stayed up for an average of 11 and a half days.

Of the 94 far-right videos that promoted racial violence, just 33 were removed.

On average it took 13 and a half days for those to be taken down, and 121 extremist videos remained fully viewable – despite being reported by researchers.
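Those raw figures are what the ‘less than half’ headline rests on. A quick back-of-the-envelope check, sketched in Python below using only the numbers quoted above, puts the removal rate at roughly 44 per cent for the Islamist material, 35 per cent for the far-right videos and about 40 per cent overall.

# Removal rates implied by the Henry Jackson Society figures quoted above.
islamist_reported, islamist_removed = 107, 47
far_right_reported, far_right_removed = 94, 33

print(f"Islamist extremist: {islamist_removed / islamist_reported:.0%}")    # ~44%
print(f"Far right:          {far_right_removed / far_right_reported:.0%}")  # ~35%

total_reported = islamist_reported + far_right_reported  # 201 reports in all
total_removed = islamist_removed + far_right_removed     # 80 taken down
print(f"Overall:            {total_removed / total_reported:.0%}")          # ~40%; 121 stayed up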

‘These ideologies can be freely disseminated and amplified online, and there is room for improvement by technology firms to provide spaces to expose and debate their inconsistencies’, said Dr Alan Mendoza, Executive Director of the Henry Jackson Society.

One video that praised Hitler and showed Jewish people being sent to concentration camps was not removed. A film promoting the Taliban was also left up. 

‘We know social media can play a role in the radicalisation of young people, drawing them in with twisted and warped ideology’, said former Labour Cabinet minister Yvette Cooper, who commissioned the study on behalf of the Commons Home Affairs Select Committee.

She said the findings were ‘simply unacceptable’.

‘YouTube have promised to do more, but they just aren’t moving fast enough. Google, which owns YouTube, is one of the richest and most innovative companies on the planet. They have the resources and capability to sort this and they need to do so fast.’    

A YouTube spokesperson told MailOnline the Google-owned site was making progress on removing hate videos.

‘Through new uses of technology, the majority of videos we removed for violent extremism over the past few months were taken down before receiving a single human flag. 

‘We’re doing more every day to tackle these complex issues’, the spokesperson said. 

Earlier this month it was revealed YouTubers have made hundreds of thousands in advertising revenue each month by putting up disturbing videos of children. 

One example is the Toy Freaks YouTube channel. It was founded by landscaper Greg Chism of Granite City, Illinois, and had 8.53 million subscribers.

Mr Chism’s channel, which features his daughters Annabelle and Victoria, appears to have been caught up in a broader purge targeting a wave of strange and inappropriate kids videos

YOUTUBE’S FIVE-POINT PLAN

Last month the company announced it is stepping up enforcement of its guidelines on these videos, after widespread criticism that it has failed to protect children from adult content.

YouTube vice president Johanna Wright explained their five-point plan in a blog post.

1. YouTube is removing content featuring minors that may be endangering the child, even if that is not what the person who made the video intended.  

50 channels have been removed because of this, including the wildly popular Toy Freaks YouTube channel featuring a single dad and his two daughters.

2. It has removed adverts from inappropriate videos targeting families. Since June, adverts have been pulled from three million videos under this policy.

3. YouTube is blocking inappropriate comments on videos featuring children.

4. In the coming weeks the company will release guidance on how creators can make enriching, family-friendly content.

5. It is doubling the number of people who flag content on the main YouTube site that is not appropriate for viewers under 18.

The Toy Freaks channel was among the 100 most-viewed on YouTube, and was probably making between $69,000 (£52,000) and $1.1 million (£820,000) per month, according to estimates by Social Blade, which tracks YouTube channel statistics.

Of this, YouTube would have been taking a 45 per cent cut.
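On those figures, a rough split, sketched below with the Social Blade range and the 45 per cent platform share as the only inputs (actual revenue deals vary), would leave the channel with roughly $38,000 to $605,000 a month and YouTube with roughly $31,000 to $495,000.

# Rough monthly revenue split implied by Social Blade's estimate and the 45 per cent cut.
platform_share = 0.45  # YouTube's cut, as stated above

for gross in (69_000, 1_100_000):  # low and high ends of the estimated gross (USD per month)
    creator_take = gross * (1 - platform_share)
    youtube_take = gross * platform_share
    print(f"gross ${gross:,}: creator ~${creator_take:,.0f}, YouTube ~${youtube_take:,.0f}")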

Though it’s unclear what exact policy the channel violated, the videos showed the girls in unusual situations that often involved gross-out food play and simulated vomiting.

In recent months, YouTube has removed advertisements from more than two million videos and 50,000 channels.

Creators say they are frustrated that YouTube is now withdrawing exploitative content it had previously facilitated and profited from.

‘What YouTube is basically doing is ignoring it until they can’t,’ Social Blade Operations Manager Danny Fratella told BuzzFeed News.

‘I think that once it got enough media traction they kind of had to do something.

‘YouTube has been favoring advertisers over creators recently and the advertisers got nervous and that’s the catalyst that drives the platform to take videos down.’

In November volunteer moderators revealed there could be as many as 100,000 predatory accounts leaving inappropriate comments on videos.

According to the BBC, some include the phone numbers of adults, or requests for videos to satisfy sexual fetishes.

The children in the videos appeared to be younger than 13, the minimum age for registering an account on YouTube.

‘There are loads of things YouTube could be doing to reduce this sort of activity, fixing the reporting system to start with’, an anonymous flagger said.

They estimated there were ‘between 50,000 to 100,000 active predatory accounts still on the platform’.

The investigation came just two days after YouTube announced a crackdown on sexualised or violent content aimed at ‘family friendly’ sections of YouTube.

 


