Before the Internet of Things (IoT) boom, technology futurists breathlessly made predictions like, “Imagine your refrigerator can tell you when you’re out of milk!” That’s a good thing, right? Maybe. Maybe not. As we have started to live with “smart” devices like Alexa and life-simplifying apps on our phones, the ugly potential of these technologies is emerging.
Domestic abusers and stalkers are finding new avenues to terrorize their victims using devices and software that were supposed to provide convenience and, ironically, greater safety. As the New York Times reported recently, IoT products like smart thermostats and security cameras are becoming vectors of control and abuse. The problem is challenging to address, and a good deal more complex than it appears at the outset. Still, solutions are on the horizon.
IoT Tools of Abuse and Control
Domestic violence treatment professionals and advocates are now seeing misuse of IoT devices for the purposes of controlling and tormenting another person. Examples include abusers tracking the locations of their victims using (sometimes hidden) GPS apps, monitoring behavior remotely using security cameras, and “gaslighting” through unpredictable changes in home temperature and the like.
It’s a frightening prospect for the survivor of the abuse, but not a particularly new scenario, according to several experts. “Domestic abuse is about controlling the other person,” said Rachel Gibson of the National Network to End Domestic Violence (NNEDV). “It used to be, the abuser would check your odometer and grill you about where you’d been in the car. Now, they look at your GPS. It’s the same behavior, just updated with modern technology.” To help survivors become more aware of the risks of technology, NNEDV publishes the website techsafety.org.
“It used to be, the abuser would check your odometer and grill you about where you’d been in the car. Now, they look at your GPS. It’s the same behavior, just updated with modern technology.”
Old or new, it’s still problematic. As Ruth M. Glenn, President of the National Coalition Against Domestic Violence (NCADV), explained, “Those that need to control will often go to extremes.” In her view, technology makes it easier to get to an extreme of abuse very quickly.
Publicized incidents are re-traumatizing survivors, too. Leslie Morgan Steiner, author of the book Crazy Love, which discusses why domestic violence victims stay in abusive situations, commented, “The recent growth of devices that allow people to listen and monitor their homes remotely is a big concern to me as a domestic violence survivor and advocate. I’ve been hearing stories from current victims about this technology increasingly being used to instill fear in loved ones, to make false accusations about what they do at home in their free time, and to dominate and control them psychologically.”
To Steiner, the technology also fuels abusers’ paranoia and drives them to give in to their obsessive and unrealistic desire to control and judge every aspect of a partner’s behavior. She added, “What disturbs me most as a former victim myself is that home is where you should feel safe and relaxed, and these monitoring tools instead make victims feel anxious and terrorized, and even more afraid of making a safety plan to end the abusive relationship.”
The Risk of Stalking through the IoT
Remote stalking by a stranger is another disturbing and all too real consequence of the proliferation of IoT devices in the home. Yotam Gutman, VP of Marketing at SecuriThings, a company whose technology prevents abuse of IoT devices, described two basic IoT stalking scenarios. In one case, there is a semi-stranger, perhaps a work acquaintance, who hacks into home devices in order to spy on an individual. This behavior may be part of a psychological fixation (e.g., erotomania) in which the stalker imagines he or she is having a relationship with the other person—who in all likelihood has no idea of what is going on.
The other (unfortunately) common situation is for a complete stranger, like a security company technician, to use IoT technology to eavesdrop on, watch, and potentially manipulate a victim. In this case, the victim could be hundreds of miles away and, of course, completely unaware of the illicit surveillance.
Dealing with the IoT Abuse and Stalking Threats
There are a number of ways to detect and prevent the misuse of IoT devices for the purposes of abuse and stalking. Their efficacy is uneven and somewhat dependent on the individual’s level of commitment to solving the problem.
Awareness is a key first step. This is critical in the experience of Susan Moen, Executive Director of the Jackson County, Washington, Sexual Assault Response Team (SART). “It comes up a lot, but people don’t want to believe this is happening to them,” she explained. “Plus, they may not understand the technology very well, and to be honest, who does?”
Moen and her team counsel victims of domestic violence, stalking and sexual assault to keep track of their technological exposure. “For example,” she said, “Are you experiencing what we call ‘social leakage’? Are you telling your sister you’re going to a party, which she posts on Facebook and, in the process, accidentally invites your abuser? It’s not just devices. It’s the complete social media and technology fabric of your life. You’re exposed and you need to figure out where you’re giving information to your abuser.”
“Are you experiencing what we call ‘social leakage’? Are you telling your sister you’re going to a party, which she posts on Facebook and, in the process, accidentally invites your abuser?”
Ruth Glenn has had a similar experience counseling people in technology-driven abuse situations. “When you start to disentangle yourself from a life partner, in tech terms, it’s mind boggling how many different accounts and devices you have to deal with,” she observed. “You have credit card accounts, phone contracts, cable TV, Internet, Wi-Fi, home devices, home security systems, financial accounts and on and on — each one of these can become a way of controlling and abusing another person.”
Leslie Morgan Steiner struck a hopeful note in this context, saying, “There is an upside to the technology, in that in-home monitoring can also be used to record abuse, which in the long run can be used to hold abusers responsible for their emotional and physical violence. Too often, abuse is challenging to prosecute because it is misinterpreted as a he said/she said crime. This technology in effect can be used as a witness to the violence, thus aiding police and judges trying to hold an abuser responsible.”
“Too often, abuse is challenging to prosecute because it is misinterpreted as a he said/she said crime. This technology in effect can be used as a witness to the violence, thus aiding police and judges trying to hold an abuser responsible.”
The legal remedies may not be as sound as one might imagine, however. Paul Gelb, Esq., a Los Angeles-based attorney who specializes in data privacy law, highlighted the complexities and challenges involved in making a legal case against an abuser who uses IoT devices. In Gelb’s view, though there are statutes that help victims, they can be tricky to apply. Revenge porn laws are useful, but limited. Misuse of a listening device, for example, might constitute a violation of the Federal Wiretap Act, since a person has a “reasonable expectation of privacy” in his or her home. However, as Gelb pointed out, federal courts have ruled that the federal wiretapping statute seldom, if ever, applies to domestic conflicts. The government is reluctant to make nonconsensual spousal recordings a federal crime.
One issue that comes up in pursuing such cases, as Gelb noted, is the definition of “consent.” For instance, if a victim does not change the password to a device that the abuser has access to, does this imply some sort of consent for the abuser to use it? Alternatively, it is possible to argue successfully that changing a password is effectively denying consent for the abuser to access the device.
Leveraging Artificial Intelligence to Detect IoT Abuse
One of the most basic problems in dealing with IoT abuse comes from the sheer scale of the installed base. There are millions of listening devices, cameras, sensors and actuators in people’s homes today, and the number is only going up. And, according to Gutman of SecuriThings, most of these products were not built with security in mind. They’re easy to hack.
SecuriThings uses Artificial Intelligence (AI) and machine learning to analyze IoT device usage and behavior across very large deployments. For example, they can monitor access logs to a million security cameras and detect anomalies that might indicate abuse. A human observer would never be able to correlate the activities that signal abusive behavior. Only a machine can do it. The process is based on a software “agent” they install on each device. It tracks use and reports back to the SecuriThings database in the cloud.
They can monitor access logs to a million security cameras and detect anomalies that might indicate abuse.
They found, in one case, a set of cameras that were being accessed from the same remote location dozens of times. It turned out that an employee of a service provider was improperly using the cameras to watch people in their homes. They can also find malware and other malicious misuse of devices.
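SecuriThings has not published the details of its models, but the kind of anomaly it describes — one remote source repeatedly accessing cameras it has no business touching — can be illustrated with a deliberately simple sketch. The function, thresholds, and log format below are all hypothetical, and a real system would learn per-device baselines with machine learning rather than rely on fixed cutoffs:

```python
from collections import Counter

def flag_suspicious_access(events, max_accesses=10, max_cameras=3):
    """Flag remote sources whose access patterns look anomalous.

    events: list of (camera_id, source_ip) access-log entries.
    The thresholds are illustrative only; a production system would
    model normal behavior per device instead of using fixed limits.
    """
    accesses_per_source = Counter(ip for _, ip in events)
    cameras_per_source = {}
    for cam, ip in events:
        cameras_per_source.setdefault(ip, set()).add(cam)

    flagged = set()
    for ip, count in accesses_per_source.items():
        # Too many total accesses, or too many distinct cameras,
        # from a single remote source is treated as suspicious.
        if count > max_accesses or len(cameras_per_source[ip]) > max_cameras:
            flagged.add(ip)
    return flagged

# Hypothetical log: one IP sweeping through dozens of households' cameras,
# plus one ordinary single access from a different address.
log = [("cam-%d" % i, "203.0.113.7") for i in range(40)]
log += [("cam-1", "198.51.100.2")]
print(flag_suspicious_access(log))  # → {'203.0.113.7'}
```

Even this toy version shows why scale matters: the rule is trivial per source, but applying it across millions of devices — and correlating the results — is work only automated analysis can do.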
It appears that the world is only in the early stages of confronting the risks of IoT misuse. Everyone must now catch up: the law, advocates and counselors, and of course, technology. The risks may increase in scale, too, as immense volumes of private data from home devices accumulate in the cloud and on other less-than-secure platforms. SecuriThings and their peers in the cybersecurity industry are already exploring solutions to these challenges.