The Creepiest Things That People's Smart Home Systems and Alexas Have Ever Done

Sarah Blumert

Technology hurtles forward at a startling pace. If you’ve ever heard someone complain that they replaced their cell phone only for a new model to come out the next day, you know the frustration and confusion that accompany rapid technological progress. One fairly recent trend in consumer technology is the smart home system. These devices can dim your lights, adjust your thermostat, and even play any song imaginable with simple voice commands. As such gadgets become more common, so do weird smart home stories that will leave you wondering whether you really want to bring one of these devices into your abode.

One of the most popular smart home systems is the Amazon Echo. Echo's AI assistant, Alexa, is well known for her intuitive but often odd behavior, and creepy Alexa stories have become a staple of social media. While many customers rave about the convenience of having Alexa in their house, some have stories detailing her more eccentric moments, ranging from light and humorous to utterly chilling. Alexa has done some questionable things, but she's far from the only scary smart home tech out there. From hackable baby monitors to devices that record you at random, you'll find some true horror stories below. Whether or not you own one of these devices, these tales will give you pause before you blindly plug yourself into the expansive Internet of Things.

1. Study Finds That Smart Home Systems Can Hear Messages That We Can’t

As soon as we began inviting what is essentially AI into our homes, there was talk of the potential for widespread abuse. A study covered in The New York Times in May 2018 showed the alarming capabilities of devices like Alexa. Researchers from UC Berkeley and Georgetown University discovered that they could hide commands in white noise or YouTube videos to get the device to do things like open a website or turn on airplane mode.

Because the frequencies used to deliver these commands fall outside the range of human hearing, they can even be embedded in recordings of music or spoken text. While the devices themselves are not responsible for these hidden messages, the study does demonstrate how anyone with an unsavory agenda could exploit the accessibility of someone’s home system. As one of the study’s authors put it, “My assumption is that the malicious people already employ people to do what I do.”

2. High-Tech Baby Monitor Sings the Worst Lullabies Ever

In October 2014, two parents in Cincinnati, Ohio, were startled to hear unexpected sounds coming from the Internet-enabled baby monitor in their 10-month-old daughter’s bedroom. They awoke at midnight to the sound of a man screaming, at one point even yelling, “Wake up, baby!” They discovered the voice was coming from the monitor itself.

When they checked the monitor’s camera feed from a cell phone, they saw that it was moving on its own. And when the father ran into his daughter’s bedroom to check on her, the camera turned directly toward him and the voice began yelling obscenities until he unplugged the device from the wall.

Tech experts noted that wireless IP cameras like the one the family used are fairly easy to hack, leaving many consumers feeling unsafe and violated.

3. Alexa Makes Unprompted Sounds That Amazon Doesn’t Even Recognize

There have been many accounts of Alexa bursting into unprompted laughter or spouting random sentences, but sometimes her contributions skew toward the horrifying end of the spectrum. In a New York Times article published in February 2018, Farhad Manjoo shared the story of how his Alexa terrified him and his wife as they were settling into bed one night.

With no prompting from either person, Alexa’s blue ring lit up, indicating that she had heard her wake word. She then emitted what Manjoo described as a “wail, like a child screaming in a horror-movie dream.” After hearing about the incident, an Amazon employee offered to investigate but clarified that they had never heard of Alexa doing such a thing. In fact, they didn’t even think Alexa was capable of making that kind of sound.

Needless to say, Manjoo and his wife were unnerved, but elected to keep Alexa for the convenience, despite her occasional nightmarish conversation starters.

4. Alexa Sends a Family’s Private Conversation to a Contact 176 Miles Away

A May 2018 news article reported that a Portland family's Alexa recorded private conversations in their home and then, without being asked and without notifying the family, sent the recordings to one of their contacts in Seattle. The recipient, an employee of a family member, immediately contacted them and advised them to “unplug your Alexa devices right now.”

Upon being contacted about the incident, Amazon apologized profusely to the family and released a statement detailing what they believed took place, saying, "We investigated what happened and determined this was an extremely rare occurrence.” Even so, the company refused to offer the family a refund for their devices. 
