YouTube will no longer recommend conspiracy or medically inaccurate videos
Google-owned video sharing platform YouTube has made a change to its recommendation AI, announcing that it will no longer recommend videos that come close to violating its community guidelines, including medically inaccurate videos and those putting forth conspiracy theories. However, the move will not affect the availability of such videos. Here's more.
The change was announced at the end of January
YouTube announced the change via a blog post on January 25, and recently, Guillaume Chaslot, a former Google engineer who helped design the site's recommendation AI, told NBC News that the change was a "historic victory". Chaslot added that the move was an important step towards creating more "humane technology", instead of technology that deceives gullible users.
How the change will affect the YouTube experience
In the blog post, YouTube said it had tweaked its recommendation AI to "pull in recommendations from a wider set of topics". Earlier, if a user watched, for instance, a cookie recipe video, they would mostly get suggestions for other cookie recipes. The recommendation AI worked the same way for conspiracy videos, pulling users down a rabbit hole of conspiracy theories if they happened to watch one such video.
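To illustrate the general idea only: the toy sketch below shows the difference between ranking candidate videos purely by a relevance score (where one topic can dominate the list) and capping how many videos any single topic may contribute. This is a hypothetical example with invented video titles, topics, and scores; it does not reflect YouTube's actual recommendation system.

```python
# Hypothetical toy: "same-topic" vs. "topic-diversified" ranking.
# All data, topic labels, and scoring here are invented for illustration.
from collections import Counter

CANDIDATES = [
    {"title": "Chewy chocolate chip cookies", "topic": "cookie recipes", "score": 0.95},
    {"title": "Oatmeal raisin cookies",       "topic": "cookie recipes", "score": 0.93},
    {"title": "Snickerdoodle recipe",         "topic": "cookie recipes", "score": 0.91},
    {"title": "Sourdough bread basics",       "topic": "baking",         "score": 0.80},
    {"title": "Knife skills for beginners",   "topic": "cooking",        "score": 0.72},
]


def score_only(candidates, n=3):
    """Old-style behavior: rank purely by score, so one topic dominates."""
    return sorted(candidates, key=lambda v: v["score"], reverse=True)[:n]


def diversified(candidates, n=3, per_topic_cap=1):
    """Diversified behavior: cap how many videos a single topic contributes."""
    picked, per_topic = [], Counter()
    for video in sorted(candidates, key=lambda v: v["score"], reverse=True):
        if per_topic[video["topic"]] < per_topic_cap:
            picked.append(video)
            per_topic[video["topic"]] += 1
        if len(picked) == n:
            break
    return picked


if __name__ == "__main__":
    print("Score-only ranking:")
    for v in score_only(CANDIDATES):
        print(f"  {v['title']} ({v['topic']})")
    print("Topic-capped ranking:")
    for v in diversified(CANDIDATES):
        print(f"  {v['title']} ({v['topic']})")
```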
The kind of videos that will no longer be recommended
"...We'll begin reducing recommendations of borderline content and content that could misinform users in harmful ways—such as videos promoting a phony miracle cure for a serious illness, claiming the earth is flat, or making blatantly false claims about historic events like 9/11," said YouTube.
The change will not affect availability of such videos
That said, the tweak to the recommendation AI will not affect the availability of conspiracy videos or medically inaccurate videos. YouTube said such videos would continue to exist on its platform and will remain visible if users search for them. Users who subscribe to channels that promote such content, for instance conspiracy videos, may still see related recommendations, but others will not.