Sick bestiality images are appearing on YouTube despite the site’s promise that it has improved its moderation efforts.
Videos using thumbnails of women who seem to be engaging in sexual acts surface in results for banal search phrases like ‘girl and her horse’.
A number of the thumbnails have been on the website for months.
Uploaders have dodged the platform’s AI filters and human moderators by only using graphic content in preview ‘thumbnail’ images that appear in search results.
An unnamed YouTube employee told BuzzFeed that the platform’s thumbnail monitoring tech still isn’t as powerful as its video algorithms.
YouTube videos showing sick images of women performing sexual acts on horses and dogs appear in the platform’s top search results, according to a new report. Some of these videos have millions of views and have been on YouTube for months (stock image)
The shocking discovery was uncovered in a new investigation from BuzzFeed News, which reported that images of bestiality are extremely ‘easy to find’ on YouTube.
The report found that creepy but non-explicit videos featuring scantily clad women petting or grooming animals were advertised using explicit bestiality thumbnails.
A large number of thumbnails featured women engaging in sexual acts with horses.
YouTube thumbnails appear beneath the play button in search or ‘related videos’ results and are intended to give potential viewers a preview of what the video shows.
Channels using thumbnails containing bestiality are ‘advertising’ their videos with explicit imagery, but YouTube has repeatedly failed to take the clips down.
The videos appear when searching banal phrases like ‘girl and her horse’, sparking concerns that children could access the explicit images. Uploaders have dodged the platform’s moderation systems by only using graphic content in preview ‘thumbnail’ images (stock)
The report found that searching YouTube for ‘girl and her horse’ returns more than 12 million results.
Among the first 20 results were four videos marketed with thumbnails of women engaged in sexual acts with horses.
The top result for that search query was a video promoted by a half-blurred thumbnail of bestiality.
The clip, titled ‘Fantastic Girl and Her Horse in My Village’, had amassed 35,000 views in the four weeks it had been on YouTube.
While the video itself contained no bestiality, clicking on it prompted YouTube’s recommendation algorithm to suggest dozens more animal videos.
Many of these had thumbnails featuring graphic bestiality, including one published by a channel called ‘ALL ANIMAL’ that had amassed 2.3 million views.
In response to the report, a YouTube spokesperson told MailOnline: ‘These images are abhorrent to us and have no place on YouTube.
‘We have strict policies against misleading thumbnails, and violative content flagged to us by BuzzFeed has been removed.
‘We’re working quickly to do more than ever to tackle abuse on our platform, and that includes developing better tools for detecting inappropriate and misleading metadata and thumbnails so we can take fast action against them.
‘In the last quarter of 2017 we removed over eight million videos for violating our policies and we will continue to invest heavily here to tackle this problem head on.’
YouTube announced in December 2017 that it would hire 10,000 extra human moderators to monitor videos amid concerns that too much offensive content was making it onto the site.
Susan Wojcicki, the chief executive of the video-sharing site, revealed that YouTube enforcement teams had reviewed two million videos for extremist content over the preceding six months – removing 150,000 from the site.
Around 98 per cent of the videos that were removed were initially flagged by machine-learning algorithms.
Almost half were deleted within two hours of being uploaded, and 70 per cent were taken down within eight hours.
Ms Wojcicki said: ‘Our goal is to stay one step ahead, making it harder for policy-violating content to surface or remain on YouTube.
‘We will use our cutting-edge machine learning more widely to allow us to quickly remove content that violates our guidelines.’