AI
3 out of 10 US teens use AI chatbots every day
The internet has long been the digital food court where teens hang out, and according to a new Pew Research Center study released Tuesday, that hasn’t changed much. It’s just gotten weirder. And more robotic.
Nearly every US teen (97%) logs on daily, with about 40% saying they’re online almost constantly.
Sure, that number is technically down from last year’s 46%, but compared to 2015, when only 24% said the same, today’s teens are practically living in Wi-Fi.
Meanwhile, governments are starting to panic. Australia is gearing up for a ban on social media for anyone under 16, while the US surgeon general is out here asking for cigarette-style warning labels on Instagram.
Yes, we have reached the “may cause emotional distress and chronic doomscrolling” stage of civilization.
Enter AI chatbots, the internet’s newest teen obsession. Pew found that about three in ten teens use chatbots every day, and 4% admit to using them almost constantly, which feels like the digital equivalent of having a crush on your calculator.
ChatGPT leads the pack with 59% of teens using it, trouncing Google’s Gemini (23%) and Meta AI (20%).
Some teens never touch them, but plenty check in weekly, especially older teens, those from higher-income households, and Black and Hispanic teens, who use the tech more often than their white peers.
But this isn’t just harmless “help me with homework” stuff. In rare but devastating cases, chatbots have crossed ethical guardrails; lawsuits allege that ChatGPT and Character.AI gave explicit instructions for self-harm to minors who later died by suicide.
One AI platform has already banned minors and switched to a safer choose-your-own-adventure format, presumably something that won’t tell you how to tie a noose.
These tragic incidents represent a tiny fraction of interactions, but with platforms serving hundreds of millions of users, even “tiny” can mean over a million weekly conversations about suicide.
Experts say that even if the bots weren’t built to provide emotional support, teens are treating them like confidants, meaning tech companies may need to stop pretending their chatbots are just smarter search bars and start acting like responsible digital adults.
