
Alexa and Google Assistant are dropping f-bombs and talking about cocaine

A subtle reminder that technology isn’t perfect.

Image: Unsplash (edits: KnowTechie)

Amazon Alexa, Google Assistant, and other voice assistants seem to get smarter with each passing year. Unfortunately, that doesn’t mean they’re perfect, as some users have discovered, according to The Wall Street Journal.

Users are reporting strange occurrences with their smart speakers. Among the more bizarre instances is the story of Rheganne Mooradian, 24, from New Mexico. One day, shortly after quitting her job, she was crying when she heard her Echo Dot speaker say, “It’s going to be OK.”

Panicked, she immediately turned off the device.

“I unplugged her instantly and I literally ran downstairs and shoved her in a drawer,” said Ms. Mooradian, who lives in Albuquerque, N.M. “I was just like, whoa, this is not normal. She’s not supposed to do that.”

Does Google Home sling dope?

Wanda McDaniel, 63, had a similar experience with her Google Home Mini, which she received for Christmas.

In August, she heard Google Assistant announce an alarm for “cocaine and reefer.” Her husband, Calvin, heard the message too and said, “I jumped up. What’s this, a dope deal?”

Watch your language, Alexa

Finally, Neva and Rick Sprung of St. Louis told The Wall Street Journal that they were visiting family last winter when a man’s voice suddenly came from an Echo speaker, spewing expletives.

“It was very strange but it was ‘f—, f—, f—, f—,’” said Mrs. Sprung, 65. “There might have been some F-yous in there. It was just a straight effing rant.”

It’s difficult to determine exactly what happened in all three of these cases. In the first, Amazon offered tech support, but Mooradian declined; she now keeps her Echo Dot plugged in only when she’s using it.

The McDaniels’ situation occurred because Google Home had heard a pastor on television say the words, “They lose their love for cocaine and reefer,” while speaking about spirituality and addiction. Apparently, the words “they lose” were picked up as “hey Google.”

Finally, in the case of the Sprungs, their Alexa history showed that the Echo had heard instructions to “play another person.” It chose a track called “Another Person,” which contains the F-word multiple times.

The bottom line: Voice assistants are indeed smart, but they aren’t perfect. As such, you might occasionally hear something strange from yours.

Has your voice assistant said something strange? Let us know below.


Bryan considers himself a well-rounded techie, having written articles for MakeUseOf, KnowTechie, AppAdvice, and iDownloadBlog. When he's not writing, he's being a single dad and rooting for his alma mater, Penn State, or cheering on the Patriots.
