A topic making the rounds online at the moment is YouTube's announcement that conspiracy-theory videos may stop reaching a large share of its users: a change to the platform's algorithms would keep such videos from appearing among recommendations.
According to the statement, YouTube is reworking the recommendation algorithm that suggests new videos to users in order to curb conspiracy theories and false information, a response to mounting pressure on the world's largest video platform after several public missteps.
In a blog post, YouTube declared: “We will begin reducing recommendations of borderline content and content that could misinform users in harmful ways, such as videos promoting a false miracle cure for a serious illness, claiming that the Earth is flat, or making blatantly false claims about historical events like 9/11.”
Because these videos do not violate the platform's terms of service, they will not be removed from the site. Users can also continue uploading such content, and viewers can still find it on the channels that publish it or through search.
“We believe that this change strikes a balance between maintaining a free speech platform and fulfilling our responsibilities to users,” the post said.
The move came after YouTube acknowledged several complaints that it was automatically adding conspiracy and related videos to playlists, as well as surfacing them in recommendations on the homepage.
Since 2016, the platform has also incorporated satisfaction, likes, dislikes and other signals into its recommendation system. Yet, starting from mainstream videos, the algorithm often veers sharply toward suggesting radical ideas.
The change will affect “less than one percent” of the site's content. For now it applies only in the United States, though a global rollout is already under consideration.