
Research points the finger of blame at YouTube for the rise in Flat-Earthers

Get your grandma off the internet, stat.

[Image: Tesla in space with Earth in the background. Credit: SpaceX]

While YouTube has taken some worthwhile steps to combat the disinformation, conspiracy theories, and other unsavory rabbit holes set in motion by its recommendation algorithm, it appears to be a case of “too little, too late.” Researchers have discovered a link between the rise of Flat Earthers and conspiracy videos hosted (and promoted) by the video site.

Now, just a word of warning: this study has a fairly small number of interviewees, so its findings might not hold up at scale. That said, the team from Texas Tech University spoke to 30 Flat Earthers at the recent annual Flat Earth conference in Denver, Colorado, where “650 people from around the world” met to discuss their theory. All but one of the interviewees said they hadn’t considered the idea at all until they saw YouTube videos promoting it.

Oh, and the one person who didn’t credit YouTube? He was there with his daughter and son-in-law, who told him about it after watching the videos on YouTube themselves. Looks like those researchers are batting a thousand to me.

How do people find these videos?

The gateway to these outlandish, science-denying videos? Other conspiracy videos, such as alternative 9/11 theories, fake moon landing claims, and Sandy Hook denial videos. The algorithm is designed to keep people watching so it can serve more ads, and in turn, one conspiracy video leads to recommendations for others, like flat earth content.

Asheley Landrum, an assistant professor at Texas Tech University who ran the study, says that while YouTube isn’t technically doing anything wrong, it could be doing more to protect its users. Landrum states, “Their algorithms make it easy to end up going down the rabbit hole, by presenting information to people who are going to be more susceptible to it.”

Guillaume Chaslot, a former Google engineer who worked on the recommendation algorithm, has explained why conspiracy videos get promoted over fact-based ones: the hyper-active conspiracy community skewed the engagement data the AI learns from, leading the algorithm to recommend those kinds of videos to ever more users. Luckily, Google is working on that.
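
To see how that kind of bias creeps in, here’s a minimal, purely illustrative Python sketch (nothing like YouTube’s actual system, and every name and number in it is made up) of a recommender that ranks topics by raw watch time. A small but hyper-engaged niche ends up dominating the rankings:

```python
from collections import Counter

# Hypothetical watch log: a small, hyper-engaged niche can outweigh a
# larger, casual audience when the objective is total watch time.
watch_log = [
    # (viewer_group, video_topic, minutes_watched)
    ("casual", "science", 5),
    ("casual", "news", 4),
    ("casual", "science", 6),
    ("conspiracy_fan", "flat_earth", 45),
    ("conspiracy_fan", "flat_earth", 50),
    ("conspiracy_fan", "moon_landing_hoax", 40),
]

def rank_by_watch_time(log):
    """Score each topic by total minutes watched -- the naive
    engagement objective that lets a loud minority dominate."""
    totals = Counter()
    for _, topic, minutes in log:
        totals[topic] += minutes
    return totals.most_common()

# Conspiracy topics float to the top even though most viewers are casual,
# so they get recommended to everyone -- generating yet more watch time.
print(rank_by_watch_time(watch_log))
# [('flat_earth', 95), ('moon_landing_hoax', 40), ('science', 11), ('news', 4)]
```

Any real system is vastly more complicated than this, but the core failure mode is the same: optimize purely for engagement, and the most obsessive audience ends up steering the recommendations.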

Surprised by these results, or would you like to see a bigger study before forming an opinion? Let us know down below in the comments, or carry the discussion over to our Twitter or Facebook.


