TikTok will now give you a heads up if you land on a video it thinks contains misinformation
This isn’t really a feature, but hey, we’ll take it.

Social media platforms have had it with misinformation on their networks, and now they’re finding ways to curb its spread. Facebook does it, and Twitter does too. TikTok now joins those ranks, as the company is rolling out a new feature that displays warnings on videos containing claims its fact-checkers could not verify.
If you happen to run across one of these videos, the post will display a warning label that says “Caution: Video flagged for unverified content.” This basically means one of TikTok’s fact-checkers reviewed the video but could not confirm whether the information in it is true or false.
Naturally, TikTok will let the creator know the video they uploaded was flagged as unsubstantiated content. The weird thing is that TikTok will still let people share these videos, but it adds an extra step asking if they’re sure they want to share. If not, there’s an option to cancel, too.
It’s unclear how TikTok fact-checks its videos or what type of content it targets. The company does not say how many videos it fact-checks in a day, but it told The Verge that fact-checking is often focused on topics like elections, vaccines, and climate change, and that a video doesn’t have to reach a certain level of popularity to qualify for review.
