YouTube is a massive platform, rich with all sorts of information, and inevitably some of it is less than accurate. Google has now shed some light on how it deals with such content on the video hosting website.
The way the internet search giant explains it, Google follows what it calls the 4R framework. Half of this involves removing content that violates its community guidelines and reducing the visibility of borderline material. The other half consists of raising authoritative voices on information and rewarding trusted content creators.
One example of this is the addition of News shelves on the site. Whenever you're browsing, or even searching for something specific, you'll see a section reserved for authoritative news sources. These shelves appear even more frequently now that the world is dealing with the COVID-19 pandemic.
Google also explains that less than 1% of all content on YouTube is what the platform considers harmful and ultimately removes. The initial flagging is usually done by machine learning systems, after which a human reviews the video and completes the process.
The company aims to reduce this number even further so that there's less misinformation on its platform. Keeping people away from such content is especially important now, with the world facing the COVID-19 pandemic.