Commentary: Do you really want Alexa or Google Assistant to hear you whispering two rooms away?
An Amazon Echo is across the room, and I'm playing a recording of the wake word "Alexa" over and over again at various decibel levels. It's a relatively simple test -- one that I've performed in some shape or form many times over the years -- and I'm consistently impressed. Smart speakers like Amazon's growing roster of Echos and Google's Home and Nest devices, it turns out, are really good at listening to you. Like, really good.
In testing the newest Echo Dot, for instance, I could stand 30 feet away and speak at a low level, and Alexa would still respond. But could a better design make a smart speaker worse for the customer?
There's not a simple answer to that question, but it's a question we need to be asking nonetheless -- and not just about smart speakers.
Advancing technology gives rise to all sorts of ethical concerns, but smart home tech is uniquely positioned. It lives in our most intimate spaces, where we eat, where we talk with family, where we dress, where we sleep. Sure, your Nest Mini's mic and your Echo Show's camera can be disabled, but the tools of surveillance are still there.
We're already wading into muddy waters, ethically speaking, what with Amazon and Google retaining recordings of voice queries -- including conversations captured when Alexa mistakenly "heard" its wake word. It's not just smart speakers, either. Smart cams have begun incorporating facial recognition technology, and Ring video doorbells in particular are being used by police forces to create surveillance networks in neighborhoods.
The cameras and microphones we willingly bring into our homes are creating the conditions for a privacy-denuded society. But for many, the services rendered are worth the risk incurred. If you don't plan on robbing houses in your neighborhood, after all, why object to police monitoring your (or your neighbor's) Ring camera feed?
A few years ago, I interviewed cybersecurity experts at a number of top universities, including Harvard and Johns Hopkins. As I talked with them, trying to get a basic handle on the security concerns in a burgeoning smart home market, they seemed increasingly agitated. Finally, one researcher interrupted me, saying my questions were the wrong ones to ask.
The risk with smart home devices, he said, wasn't that burglars might hack your smart lock to break into your house. The risk was that hackers could exploit tens of thousands of poorly secured smart home devices across the country for all sorts of purposes -- purposes that might never touch their owners individually. In other words, that camera, printer or smart fridge probably won't directly jeopardize your personal safety, but it might well degrade the security of banks, web services or other large institutions the world over.
Only a few months after I conducted those interviews, the Mirai botnet attack of 2016 hijacked over 300,000 devices to overwhelm the DNS provider Dyn, knocking Twitter, Netflix, Reddit, Pinterest and several other sites offline in a DDoS (distributed denial of service) attack. The tools of the attack: thousands of printers, off-brand smart cams and other smart home gadgets.
The device insecurity that enables DDoS attacks is different from large-scale privacy degradation, but the consumer mindset fueling it is the same: We as a society are more concerned with the lifestyle benefits of a $50 off-brand smart cam than the threat such a device poses to economic and political stability. We are happy to provide governments and corporations unfettered access to our private lives via microphones and cameras, not to mention Google Maps tracking our locations or Starbucks analyzing our purchasing behaviors, because we feel only the convenience of those products and services. Meanwhile, the privacy that undergirds democratic freedom -- of political dissent, reputation management and exploration of ideas (popular or not), to name only a few -- slowly deteriorates.
Before I'm dismissed as radical or unrealistic, I should admit that I own an Echo Dot and I use Google Maps almost daily. But I have taken the practical steps of using DuckDuckGo instead of Google, deleting many of my "rewards" apps and keeping my Echo on mute anytime I'm not using it. Seeing as policymakers range from somewhat competent to shamefully inept when it comes to understanding platforms like Facebook and Instagram, maintaining privacy norms in American society might come down to our individual willingness to inconvenience ourselves.
As I stand across from a smart speaker, playing a low-volume recording of its wake word, I have to ask myself as a reviewer whether a better microphone makes a better smart home device. Do we really want Google Assistant to be able to hear us whispering two rooms away? Do we want Alexa to know when we're in better or worse moods? Do we want the Facebook Portal to be able to follow our faces around a room?
Sure, smart speakers have yet to be used, to our knowledge, to surveil criminal suspects or anyone else. But as companies gather more data than they claim and police and government officials conduct meetings about surveilling citizens in private settings, the phrase "the privacy of our own homes" seems to mean less and less. Try as they might, developers can't disentangle their creations from the ways those devices are used. Smart home tech is as ethically neutral as a loaded gun. Sure, a gun won't shoot anyone on its own. But with so many hands fumbling to disengage this gun's safety, it won't be long before one of them pulls the trigger.
Originally published Oct. 31, 2019.