In late September, Amazon devices chief Dave Limp stood onstage between dozens of reporters and a screen displaying a Twitter message from a customer: "I have privacy concerns surrounding your Alexa device."
Limp affected a somber tone. "We care about this," he said. And I believe him. Tech giants like Amazon do care. Their business models depend on data collection, and the optics of events like September's are critical to preserving such models.
Policymakers and advocates of all stripes are fighting to keep your data private, but it feels like a losing battle. Even California's Consumer Privacy Act -- the strongest data privacy law in the US as of 2020 -- only protects consumers partway: Companies must inform California citizens of their intent to collect data, and they must comply with customer requests to delete that data. But after protracted legislative sessions and heartfelt "We care" statements professed on stage, tech giants like Amazon will continue to wring vast quantities of money from the numbers we freely give them.
The solution might not be arguing harder, but rather refining the terms of the argument. Democratic presidential hopeful Andrew Yang recently proposed a seemingly radical idea: Data is a commodity harvested from users, but only corporations can monetize it. Why shouldn't we profit from that same data? We should begin to think less, Yang says, in terms of privacy and more in terms of property.
Is a change in verbiage really what we need? Yes and no.
Data: Our lives, by the numbers
The rationale for Yang's approach to data -- that is, our "likes," our queries, our reading, our political leanings -- is based on the dollar value it represents for tech companies.
It's hard to overstate how quickly corporate actors are expanding their ability to monetize our data. The same day Limp made assurances about privacy upgrades, he introduced a new Alexa feature that monitors users' emotional state (dubbed Frustration Detection) and a product called the Echo Dot with Clock. The former announcement represents yet another new source of data -- a bit like fracking, only using errors to extract frustration rather than water to extract oil -- but the latter product unveiling seems banal enough, right?
Sure, a clock on your Echo looks nonthreatening, but the simple data driving its design is noteworthy: Over the past year, Amazon has observed a billion instances of users asking Alexa the time of day, and thus, Amazon is confident the Dot with Clock will sell well. Turns out, it's easy to create supply when you don't have to predict demand -- you can freely measure it.
Uber provides an even more direct example of profiting from demand curve data. The ride-hailing company not only tracks pricing at various times of day, but also the limits of your willingness to spend -- tracking, for instance, when you check a rate and decline to order a ride. Thus, Uber can optimize its pricing in real time for peak profit, no estimation needed.
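Uber's actual models are proprietary, but the advantage of measured demand can be sketched in a few lines. Everything below is hypothetical -- invented prices, invented riders -- and only illustrates how logged accept/decline decisions turn price-setting into a lookup rather than a guess:

```python
# Hypothetical sketch: estimate a demand curve from observed
# accept/decline decisions, then pick the revenue-maximizing price.
# This does not reflect any real ride-hailing system; the prices and
# riders are invented for illustration.

from collections import defaultdict

# Each observation: (quoted_price, accepted?) -- a user checked a rate
# and either ordered the ride or closed the app.
observations = [
    (8, True), (8, True), (8, True), (8, True),
    (12, True), (12, True), (12, True), (12, False),
    (16, True), (16, False), (16, False), (16, False),
]

def acceptance_rates(obs):
    """Fraction of riders who accepted at each quoted price."""
    totals = defaultdict(lambda: [0, 0])  # price -> [accepts, quotes]
    for price, accepted in obs:
        totals[price][1] += 1
        if accepted:
            totals[price][0] += 1
    return {p: accepts / quotes for p, (accepts, quotes) in totals.items()}

def best_price(obs):
    """Price maximizing expected revenue = price * acceptance rate."""
    rates = acceptance_rates(obs)
    return max(rates, key=lambda p: p * rates[p])

print(best_price(observations))  # 12: $12 * 75% beats $8 * 100%
```

No demand estimation happens here at all -- the "model" is just a tally of what real users already did, which is precisely why this kind of data is so valuable.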
The types of data companies can gather are also growing. Google, for instance, owns seven products -- seven! -- that each boast over a billion monthly users. And for each product any given customer uses (say Gmail, an Android operating system, the Chrome browser and Google Search), Google can form a more complete profile of their behavior. That profile directs product development, but more importantly, it shapes advertising.
Advertisement or something else?
Historically, advertising has been contextual -- when you walk into a movie theater, you see trailers for movies. When you drive on the highway, you see billboards for hotels. But data collection has given rise to cross-context behavioral advertising, sometimes called targeted advertising. If you Google news about the latest iPhone, for example, ads for smartphones might show up on your Facebook feed two days later.
But in the age of personal data brokering, ads are much more concerned with shaping human behavior than they are with informing customers of useful products. Think about the marketing ecosystems that build excitement or "hype" for new products, movies, video games and so on. It's symbiosis: The latest Marvel movie profits from all the websites covering it, and the websites profit from the movie; and the whole system propels would-be customers toward consumption.
The stakes are higher than whether you buy a ticket to Spider-Man. Advertising based on IP address, device and account, as one Forbes writer recently put it, "atomizes us as consumers and further forces us to atomize ourselves as we are separated from our community as cultural referent." Such advertising alienates us from each other and our shared cultural concerns. What's more, corporate entities are directly and indirectly invested -- in the form of targeted ads -- in influencing citizens' perceptions and beliefs, even at the most personal levels.
Take my father for instance, a man in his mid-60s and happily married for over 35 years, who recently began receiving targeted ads for divorce lawyers and dating services. He had briefly checked timelines in the history of divorce law for a college course he was teaching (he's a cultural anthropologist). He was surprised by how such a neutral search could prompt such motivated ads, if only because divorce is much more profitable for law firms than his reaching that next marriage milestone.
By the same mechanisms, ads for high-interest loans regularly bombarded me and other classmates while we struggled to pay bills and freelance in grad school.
These advertisements are about high-stakes issues, ranging from the intimate, like family structure, to the fiscal, like debt, to the political, like policy referenda. In other words, Russian deepfakes are the tip of an all-American iceberg of propaganda written by biased parties and disseminated for profit by tech giants like Facebook and Google -- all bought and paid for with your data.
The order of that list is telling.
Yang's shift from privacy to property
Georgetown Professor Mark MacCarthy has laid out the various problems with treating data as property. Here are a handful:
1. The problem of co-ownership. Humans are social, and much of our data is shared with others. If two people have a conversation, for instance, whose property would it be?
2. The problem of consent. Even if our data were treated as property, that wouldn't guarantee its protection. Once it was sold or given away, customers would forfeit any claim to it, regardless of its changing hands or changing uses.
Perhaps most importantly, data is slippery: it has different values for individuals, companies and society writ large. As Will Oremus at tech and science publication OneZero observes, "How much [would you] be willing to pay for your drinking water to be kept poison-free … No dollar amount can perfectly reflect the value of data protection to citizens, and no reasonable dollar amount can deter companies for whom data is an inexhaustible gold mine."
More to the point, Yang's actual policy suggestions don't treat data like property. He proposes the rights for you to be informed of data collection and use, to opt out, to be told if a website has data on you, to be "forgotten," to be informed if your data changes hands, to be informed of data breaches and to download all your data to transfer it elsewhere. Data isn't treated here as a commodity to be bought and sold, to be owned solely by one party at a time, but rather as information for which we may exercise a right to privacy. It's a more robust version of California's bill -- not the radical (and wrongheaded) shift toward treating data as property it claims to be.
Do these words matter at all?
Though Yang's suggestions aren't exactly what they seem at first blush, thinking more elastically about privacy issues is vital to the conversation. It's important for everyday citizens -- not just journalists, lawyers, professors and presidential hopefuls -- to have opinions about privacy.
Perhaps the biggest barrier to policy solutions right now is information specialization: We think, I'm not a statistician or advertiser or security analyst, so I can't contribute to the discourse on data collection. But each of us is involved intimately with this conversation. Alexa is literally starting to pay attention when we're in a bad mood, after all. This "data" everyone keeps talking about is simply our lives, measured. Constitutionally, it might be hard to nail down whether that should be treated as property, a privacy concern or, as some writers are compellingly arguing, a human right -- but ethically, Yang is correct that we have a claim to it. The question we should all keep asking and attempting to answer is, "How can we make that claim?"