YouTube says it will crack down on recommending conspiracy videos

Posted at 3:42 PM, Jan 25, 2019

YouTube is making changes to its recommendation algorithm, which serves up new videos for users to watch, in an effort to crack down on the spread of conspiracy theories on its platform.

In a blog post on Friday, the Google-owned company said it would start reducing its recommendations of “borderline content” and videos that may misinform users in “harmful ways.”

“Borderline content” includes videos promoting fake miracle cures for serious diseases, claiming the earth is flat and making blatantly false claims about historic events such as 9/11, according to the company, which did not provide further examples. Such content doesn’t violate YouTube’s community guidelines, but the company says it comes close.

YouTube has long faced criticism for allowing misinformation, conspiracy theories and extremist views to spread on its platform, and for recommending such content to users. People who came to the site to watch videos on innocuous subjects, or to see mainstream news, have been pushed toward increasingly fringe and conspiracist content.

This month, a Washington Post investigation found that a YouTube search for “RBG,” the initials of Supreme Court Justice Ruth Bader Ginsburg, turned up far-right videos, some falsely alleging that doctors are keeping her alive with illegal drugs, that outnumbered results from reliable news sources. A BuzzFeed News investigation on Thursday found that after news-related searches, YouTube suggested conspiracy videos and content from hate groups. After a mass shooting at a high school in Parkland, Florida, last year, a top trending video on YouTube suggested that survivor David Hogg was an actor.

YouTube has also faced backlash for running ads on extremist content. A CNN investigation last year found ads from over 300 companies on YouTube channels promoting white nationalists, Nazis, pedophilia, conspiracy theories and North Korean propaganda.

YouTube said Friday’s move would apply to less than 1% of content on its platform. However, according to YouTube, users watch one billion hours of video on the platform each day, so even that small percentage is significant.

These videos will still be available on YouTube and may still appear in search results. They could also continue to be recommended to users who subscribe to channels that post such content.

“We think this change strikes a balance between maintaining a platform for free speech and living up to our responsibility to users,” the company said.

YouTube also noted the change would be gradual and would initially affect recommendations of a “very small set” of videos in the US. It will roll out the change to more countries as its systems improve, it said.

YouTube has taken other steps to address conspiracy videos in the past. Last year, it announced it would begin displaying text boxes called “information cues,” which link to Wikipedia and other third-party sources to debunk hoaxes.