YouTube might have removed 8.3 million offensive videos, but bestiality still runs rampant
A disturbing trend continues to make waves through YouTube.
YouTube recently boasted that it removed over 8 million offensive videos from its platform – including pornographic material, spam, and other types of videos that violated its policies. Over 6.5 million of these videos were removed through the use of AI, with the rest removed by members of the team assigned to the task. Even with those impressive numbers, however, it is still relatively easy to find the dark side of YouTube.
This became apparent after both Buzzfeed and Business Insider found disturbing, easily accessible video thumbnails surfacing for innocent search terms. In many cases, there was absolutely nothing wrong with the videos themselves, but content farms deliberately make thumbnails suggestive (or downright offensive) to draw in curious viewers. These content farms come and go, with many being banned, but not before the damage is done.
A YouTube spokeswoman told Business Insider,
“These images are abhorrent to us and have no place on YouTube. We have strict policies against misleading thumbnails, and violative content flagged to us by Buzzfeed has been removed. We’re working quickly to do more than ever to tackle abuse on our platform, and that includes developing better tools for detecting inappropriate and misleading metadata and thumbnails so we can take fast action against them.”
We will not go into the details of the search terms or the images associated with them, but for more information, you can check out the Buzzfeed and Business Insider stories linked above.
The real question here is how a website that hosts a seemingly endless supply of videos can keep up with these disturbing trends. AI and image recognition can go a long way, but clearly, that simply isn’t enough at this time.