
Note to Silicon Valley: How not to manage privacy

Stanford Law Fellow Larry Downes says the real problem behind recent privacy gaffes and missteps--including Facebook's--is not one of policy but of public relations.

Larry Downes
Larry Downes is an author and project director at the Georgetown Center for Business and Public Policy. His new book, with Paul Nunes, is “Big Bang Disruption: Strategy in the Age of Devastating Innovation.” Previous books include the best-selling “Unleashing the Killer App: Digital Strategies for Market Dominance.”

Editors' note: This is a guest column. See Larry Downes' bio below.

It's been a bad week for those, like me, who feel the debate over data privacy too often casts information businesses as evil Halloween monsters, determined to terrorize and humiliate their customers just for the fun of it.

On Monday, the Federal Trade Commission held the first of three conferences on privacy and technology, at which a parade of consumer advocates and legal scholars warned of an imminent data apocalypse.

Recent events seemed, alas, to support that view. Sprint, for example, reported that over the last 13 months, it has received more than 8 million requests from law enforcement agencies for GPS data about customer location and movement. (Sprint is now determining the number of customers affected, estimated to be in the thousands.)

Verizon and Yahoo filed objections to a Freedom of Information Act request that asked how much the companies charge to comply with government surveillance orders, claiming that release of the information would "shock" and "confuse" customers.

Then, Google's notoriously private CEO, Eric Schmidt, brushed aside a CNBC reporter's question about concerns that users are putting too much trust in his company, saying, "If you have something that you don't want anyone to know, maybe you shouldn't be doing it in the first place."

Most disturbing of all is what happened over at Facebook, the social-networking behemoth that now hosts more than 350,000,000 members. Based in part on complaints by government agencies in Canada and Europe, the company announced in July that it had begun testing a more comprehensive and simplified set of privacy settings, promising to give users "even greater control over the information they share and the audiences with whom they share it."

After months of what looked like careful planning, Facebook implemented its new privacy policy and user tools this week.

The announcement landed flat on, well, flat on its face. A chorus of the usual suspects, including the Electronic Frontier Foundation and the American Civil Liberties Union of Northern California, cried multiple fouls, objecting both to the nature of the changes and the way in which they were being imperiously foisted on users. "Under the banner of simplification," said the Electronic Privacy Information Center's Marc Rotenberg, "Facebook has pushed users to downgrade their privacy."

First, a word about the changes themselves. In a detailed exegesis published on Wednesday, EFF's Kevin Bankston divided the revisions into three categories: the good, the bad, and the ugly.

In the good column, Bankston noted that all Facebook users are being required to review their privacy settings and have been given new tools to simplify the process. For each individual post to their page, users can now limit who among their friends gets to see what. In the bad department, EFF doesn't like the recommended settings, which pretty much let everyone see everything.

The ugly, however, are genuinely ugly. The version of a user's Facebook page open to Facebook members and nonmembers alike will now show the user's name, profile picture, location, and gender, as well as a complete list of her friends. Most of that information can no longer be controlled other than by not providing it in the first place. (Facebook has already backtracked on the public availability of friends information.) And users can no longer opt out of letting Facebook and third-party applications, such as all those quizzes and tests my friends seem to spend most of the day filling out, access at least some information from their account and that of their friends.

Logic behind privacy policy changes
I understand why Facebook wants these changes. Given the sheer number of Facebook users, it's increasingly difficult to find friends when presented with a list of dozens of profiles with matching names and no other information.

As the company moves to find ways of making money from its network, moreover, open access to information about users is not just important--it's essential. Constraining the company's ability to publish and otherwise monetize that information limits the chances Facebook and other social-networking sites can continue to secure funding, compete in a wide-open market, and ultimately survive as a commercial enterprise.

That, at least, is the kind of reasonable explanation for the changes the company could have provided. Instead, it announced the new policy and implemented it at the same time, leaving no opportunity for user review or comment. According to EFF's Bankston, Facebook didn't disclose the creation of the new category of "publicly available information"--that is, information about a user that cannot be controlled--until "the very day it is forcing the new changes on users." (Facebook did, in fact, allow a one-week comment period on a draft of the new policy, which is more than 5,000 words long, in early November.)

The company's reliance on good relations with its users makes the ham-fisted and tone-deaf nature of these changes both "shocking" and "confusing." After a minirevolt erupted earlier this year over changes to Facebook's terms of service, in which the company seemed to grant itself a more generous license for user data, a chastened CEO Mark Zuckerberg quickly reversed course.

More than that, Zuckerberg promised that future modifications would be developed in collaboration with users on an open-source model. "Our terms aren't just a document that protects our rights," Zuckerberg wrote on the company's blog, "it's the governing document for how the service is used by everyone across the world. Given its importance, we need to make sure the terms reflect the principles and values of the people using the service."

Exactly. So why didn't Facebook learn from its own painful lesson? While the company tested the new features with some users and solicited comments on the privacy policy over the last several months, Facebook reported in November that the number of comments it received on its draft proposal "did not reach the threshold to hold a vote." That's not a good thing.

Lessons not learned
Despite the high level of emotion, rightly or wrongly, that users attach to the topic of privacy, the new policy and tools simply arrived, providing some new protections even as existing controls were unceremoniously removed. Did the company think no one would notice? These and other recent privacy gaffes and missteps have unfortunate consequences.

Consumers, already uneasy about how increasingly intimate information is being handled online, will trust companies less, raising the potential for government regulations and new privacy agencies to fill a perceived void. That would be a dangerous result, and ultimately a counterproductive one.

Introducing new layers of regulatory bureaucracy will slow the pace of exciting innovations in information technology that have kept users engaged in the first place. And interjecting government oversight over any data raises the possibility of misuse of that information by other parts of the government, a problem made all too clear by continued revelations about secret surveillance under the wide umbrella of the Patriot Act and other antiterrorism measures.

The reality is that most information services do a good and responsible job of balancing user interests in controlling information access with value derived from transactional and other data that pay for much of what happens online.

Though the bargain is often implicit, users today trade the use of information about their activities, purchases, and interests for innovative and often free services that analyze and aggregate that data. Such services help cell phone users locate their friends with Loopt, help consumers simplify their search for products and services on Amazon and eBay, and let people connect with one another in the low-transaction-cost world of social-networking applications such as Facebook and Twitter.

The real problem: PR
The real problem here is not one of policy but of public relations. Start-up companies increasingly invest early and often in legal counsel, in part to navigate the complex waters of intercompany relationships and in part to avoid potentially lethal litigation from patent trolls, unhappy competitors, and a global army of business regulators.

At the same time, marketing, as well as public and government relations, gets little attention, as companies believe that enthusiastic users are now the best form of PR a young company can get, and at a price that can't be beat.

Maybe so. But as information exchanges have moved from the purely pedestrian business-to-business networks of the 1980s to the everything-and-everybody sharing that characterizes our increasingly digital lives, companies that discount or dismiss the emotional and even irrational attachment consumers have to information about themselves do so at their peril.

It's not that Google, Facebook, and others need to change in any fundamental way how they do business. They must rather rethink the casual, careless, and often conceited way with which they communicate to users, business partners, regulators, and other stakeholders. When the lawyers lead, everyone loses.

For companies like Facebook today and everyone else tomorrow, users and the data they provide are not just the most valuable asset; they are the only asset. As consumers absorb that fact, they will increasingly use the tools of online communities--ironically, tools provided by social-networking sites themselves--to express their dissatisfaction with unequal exchanges of information for value. Better to collaborate with them now than to negotiate later, at the point of a gun.

Facebook, as Mark Zuckerberg correctly noted, is a kind of virtual nation, where terms of service and other policy documents serve as Constitution and governing law. As such, changes to both policy and practice require honest deliberation and engagement with the residents.

They can no longer be delivered as a fait accompli. For one thing, it's pretty easy for virtual citizens to revolt against a government they don't like, or simply pack up and move somewhere less tyrannical. Easier than it is in the physical world, in any case.