YouTube to reduce recommendations for content that could "misinform in harmful ways"

YouTube has come under fire for recommendation algorithms that push users towards ever more extreme content. Start out watching a news story, let YouTube's autoplay run, and just a few videos later you may end up deep in conspiracy theories. Search for health information and YouTube may suggest videos promoting miracle cures and snake oil. If all you care about is total watch time, this probably works great. But it also ends up promoting false information and pushing viewers towards extremism.

Today YouTube announced a step towards remedying that: it will reduce recommendations of videos that almost (but not quite) violate the Community Guidelines or that could "misinform users in harmful ways". Examples include videos "promoting a phony miracle cure for a serious illness, claiming the earth is flat, or making blatantly false claims about historic events like 9/11."

This is in addition to policies against misleading titles, tags, and thumbnails, and the recently announced increased policy enforcement around external links and thumbnails that violate the Community Guidelines.

The details:

  • This will not affect viewing the videos, only recommendations
  • Videos may still appear in search results and in recommendations for channel subscribers
  • This applies to less than one percent of YouTube content
  • Initially it will apply to a small subset of videos in the United States, but will likely be expanded in the future
  • YouTube will use a combination of machine learning and human review to determine which videos are affected (a rough sketch of how such a pipeline might fit together follows below)
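YouTube has not published how this works internally, but the announcement implies a two-stage pipeline: a machine-learning classifier flags candidate videos, human reviewers confirm the borderline call, and only then are the videos dropped from recommendations, while viewing, search, and subscriber recommendations stay intact. Here is a minimal illustrative sketch of that shape in Python; every name, field, and threshold is hypothetical, not YouTube's actual system:

from dataclasses import dataclass

@dataclass
class Video:
    video_id: str
    borderline_score: float           # hypothetical classifier output in [0, 1]
    human_confirmed_borderline: bool = False

REVIEW_THRESHOLD = 0.7                # hypothetical cutoff for sending to review

def needs_human_review(video: Video) -> bool:
    """High-scoring videos go to a human reviewer rather than being
    demoted on the classifier's output alone."""
    return video.borderline_score >= REVIEW_THRESHOLD

def eligible_for_recommendation(video: Video, viewer_is_subscriber: bool) -> bool:
    """Confirmed borderline videos stay watchable and searchable; they are
    only dropped from recommendations to non-subscribers."""
    if video.human_confirmed_borderline and not viewer_is_subscriber:
        return False
    return True

# Example: a video the classifier flags and a reviewer confirms
v = Video("abc123", borderline_score=0.85, human_confirmed_borderline=True)
assert needs_human_review(v)
assert not eligible_for_recommendation(v, viewer_is_subscriber=False)
assert eligible_for_recommendation(v, viewer_is_subscriber=True)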

Read more about this change
