

My own private memory hole

A European Commission proposal to give individuals a "right to be forgotten" is misguided, argues consultant Larry Downes. Data isn't tangible property.

Larry Downes
Larry Downes is an author and project director at the Georgetown Center for Business and Public Policy. His new book, with Paul Nunes, is “Big Bang Disruption: Strategy in the Age of Devastating Innovation.” Previous books include the best-selling “Unleashing the Killer App: Digital Strategies for Market Dominance.”

Editors' note: This is a guest column. See Larry Downes' bio below.

In "1984," George Orwell's classic dystopian novel, protagonist Winston Smith is a low-level bureaucrat in the Ministry of Truth. His job: to "rectify" old newspaper articles in which Big Brother's predictions or promises turned out to be false. Once the articles are rewritten, the original text--and the truth it represents--is dropped down a pneumatic tube known as a memory hole, "to be devoured by the flames."

The European Commission has recently proposed a real-life version of this fictional device, though this time with a twist. Twenty-five years after the events in Orwell's allegory took place, the Commission has announced plans to regulate what it calls an individual's "right to be forgotten." The new memory hole would be under the control not of Big Brother but of individuals--of all the Winston Smiths of the world.

But the citizens of Oceania--excuse me, the European Union--should be just as wary of this new approach to rewriting history. Erasing the truth is as dangerous as it is futile. And wrapping the effort in the flag of personal privacy only makes it appear naive or cynical. Or both.

The right to be forgotten
Under the new "Comprehensive Approach on Personal Data Protection in the European Union" (PDF), the Commission will undertake an extensive review of gaps in existing EU privacy law, proposing new legislation next year aimed at shoring up consumer rights and reducing conflicts in the privacy laws of EU member states.

The report surprised many with plans for new laws aimed at "clarifying the so-called 'right to be forgotten,'" which the Commission awkwardly defines as "the right of individuals to have their data no longer processed, and deleted when [the data] are no longer needed for legitimate purposes."

There are, of course, legitimate privacy concerns associated with new technologies. But existing laws ensuring customer control over user-supplied data such as digital photos, tweets, or tax preparation files stored with an online service are very different from a right to be "forgotten."

For one thing, under the EU's landmark 1995 privacy directive, any information that refers to or identifies an individual is considered "private" data. So a right to be forgotten may give individuals the right to demand, as the Commission puts it, "access, rectification, and deletion" of any or all information that identifies the individual, regardless of how or by whom it was collected.

The right to be forgotten may empower EU citizens to demand the suppression of any information that refers to them, including public records, newspaper reports, personal recollections, and other "private" information that wasn't supplied by the user in the first place. All of that information, under EU law, is "their" data.

Information--private or otherwise--is not property
Such a sweeping right could extend well beyond computer data. "Clarifying" the right to be forgotten could include the right to demand the destruction of paper records as well--even copies in the possession of other individuals.

Taken to its logical extreme, a true right to be forgotten would prohibit me from repeating, even in conversation, any personal facts about you I happen to know. It might even require me to purge my mind of anything about you I remember--literally to forget you.

Medical science doesn't currently support such a remedy, but that limitation hints at the bigger problem with empowering individuals to erase personal facts that have already entered the collective consciousness of others. A right to be forgotten begins by assuming that privacy is a kind of hidden possession of the individual, one that the "original" owner can later reclaim, even if it's been lost, sold, traded, or stolen--indeed, even if the individual never had it in the first place.

That's a fatal assumption. Information, personal or otherwise, is not property, at least not in the same way that a house, a barrel of oil, or a cup of coffee is property. It isn't physical, isn't something only one person can possess at a time, and doesn't get used up or worn out over time. Information is instead a virtual good, one that can exist simultaneously in the minds of everyone.

Laws can and do provide limited control over information use (fraud) or its translation into physical copies (copyright). But there's no practical way to enforce a ban on its existence. All the tapes and disks can be erased, the books burned, and the archives destroyed. But the information will still exist, at least for anyone who happens to remember ever knowing it.

The costs and benefits of information exchange
Even limiting a legal "right to be forgotten" to information supplied directly by the user could have a disastrous effect on digital life.

That's because the Internet economy uses information, including personal information, as its main source of fuel. So when a user enters identifying information into an online ordering service, or uses a social network to exchange and store messages, photos, and videos, an economic exchange takes place, trading value for value.

Amazon.com is cheaper than a physical retailer, in part because customers do their own data entry. Google, Facebook, and Twitter are all free because we let the companies' computers scan our interactions with others to offer personalized advertising. Other providers may use the information I enter to build databases of aggregated user behaviors, leading to better, more useful future products and services.

Sometimes the individual is compensated, sometimes not. But once the omelet is made, the eggs can't be unscrambled. Releasing personal information into the world, in other words, may impose a cost on the individual, but the benefit to everyone else often outweighs it. The whole is very often greater than the sum of its parts.

If users have the right to be forgotten, what happens to service agreements and informal arrangements that trade information for value? Does a user attempting to erase facts have to pay back the value they received? If not, imagine what becomes of services that are subsidized, in whole or in part, on personal information. What value is there to facts that can be "rectified" at the whim of the individual?

Like it or not (realistically both), we are leading an increasingly visible existence. Our always-limited ability to compartmentalize our lives is fading. But the benefits of more easily and safely interacting with others that come from that transparency more than compensate. That, in any case, is the delicate balance that a right to be forgotten challenges.

Of course, the rising anxiety over privacy is understandable. Technology is rapidly changing the nature of human relationships, blurring the private-public line faster than many of us are comfortable with. Anxiety, predictably, leads to calls for legislative solutions. But here, as with so many disruptive technological advancements, the unintended consequences of regulating too soon could be fatal.

This is not to say there is no room for improvement or no role for government. We can certainly do a better job of making clear the economics of information exchanges with our public, private, and individual interactions. Governments can provide mechanisms for enforcing agreements that limit the use or publication of that information for different purposes. But we can't turn back the clock. And we certainly can't rewrite history.

None of this means the EU won't try. But if the Commission really wants to protect its citizens from information tyranny, it should abandon the effort. A memory hole is a bad idea, no matter who is using it.