
Facebook's impending fight with D.C. (FAQ)

Four senators have written to Facebook CEO Mark Zuckerberg to raise concerns about the company's handling of user data. Here's how this new scrutiny could shape the immediate and long-term future of Facebook.

Caroline McCarthy, CNET News

You probably have a Facebook account--well over 400 million people do. You've probably noticed that the look and feel of your profile have recently changed (again).

And you've probably heard a lot recently about Facebook changing its privacy policies (again). Maybe you've even seen reports that Facebook employees say, offhand, that CEO Mark Zuckerberg "doesn't believe in" privacy--and that some people very high up in Washington are starting to take notice. Would government intervention in Facebook save you from unwanted snooping, or just interfere with your Mafia Wars games? A lot is still unanswered, but here's what you need to know now.

What happened?
Basically, mounting concerns about Facebook's handling of users' private data have hit a tipping point: Sen. Charles Schumer (D-N.Y.) petitioned the Federal Trade Commission early this week, asking the agency to address the issue of social networks' privacy policies. The next day, Schumer teamed up with three other Democratic senators on an open letter to Zuckerberg expressing similar concerns.

There was no scandal to ignite it all. This happened because of a series of big announcements Facebook made earlier this month at its F8 developer conference, all of which lay out its ambitions to be the arbiter of digital identity or, if you like transportation metaphors, to own and control the roads that connect the far-reaching corners of the social Web. It's complex, but to sum it up: your Facebook profile information can now travel to third-party sites in far deeper ways than the company's Facebook Connect universal log-in service allowed.

To set the groundwork for this, Facebook has modified user profiles to once again make more of their content public by default. And through a new feature called "Instant Personalization," Facebook users must now specifically opt out if they don't want third-party partners--currently limited to beta partners Yelp, Microsoft, and Pandora--to have access to their friend lists as well as those friends' publicly available Facebook information.

Why is this happening now?
To be sure, Facebook has had six years to build up a stockpile of highly personal user information. The company's servers are home to photos, videos, private messages, and far more--and this is nothing new. But generally it's not mere storage that raises lawmakers' concerns; it's the prospect of that data being shared. As we've seen in the government scrutiny of behavioral advertising, when third-party networks and partners are brought into the mix, that's when D.C. starts to take notice.

To look at the situation in terms of tabloid headlines, sharing user data is an easy-to-grasp, hot-button issue prone to adjectives like "creepy" and "Orwellian." It riles up both the left and the right, and one thing politicians don't like is an unbridled Big Brother growing like a weed out of Silicon Valley. Plus, this is where social networks start to veer into the territory of existing advertising regulations at both the state and federal levels, so it's much easier for critical lawmakers to point fingers.

But Facebook has been sharing data with third-party partners for years now. There was the ill-fated Beacon advertising program, which was modified and then shelved after protests from liberal activist group MoveOn.org and, eventually, a class-action lawsuit. On a more successful note, Facebook's developer platform (first launched in spring 2007) and the original Facebook Connect product have been integrating third-party partners into the social-networking site for quite some time.

There have been plenty of minor to moderate criticisms of Facebook's handling of user data over the years, particularly with regard to third-party partners and advertisers. And plenty of state and federal laws have been invoked, from California consumer privacy laws to the relatively obscure Video Privacy Protection Act of 1988 to commercial appropriation laws that usually apply only to celebrities.

But there's no big, sweeping policy designed to apply specifically to social networks, and that is why we're seeing this attention from a handful of senators. Facebook's recent moves are its most audacious yet with respect to user data and third parties--just look at all the reactionary headlines full of phrases like "own your identity" and "rule the Internet"--and right now there is little legal precedent.

So, should there be a law against this?
That's not entirely clear. Schumer and his colleagues aren't claiming that Facebook's new "Open Graph" initiatives are illegal. The lawmakers' main argument is that Facebook has modified its privacy policies so liberally and so frequently that it has left users feeling anywhere from befuddled to betrayed. Past changes to Facebook's privacy policies have left users wondering whether the company was giving itself license to make private photos public, or even to sell data to advertisers without letting users opt in first.

One reason Facebook was able to grow as big as it has is that it started from such minimal roots. In early 2004, when Zuckerberg was building the site in his dorm room, many an average Internet user was uncomfortable using his or her real name on the Web, let alone uploading albums full of photos or sharing a GPS-enabled location. The original Facebook was a private social-networking site for students from a single college--verified by e-mail address. Even after it opened to anyone, it was still hidden behind a log-in wall. Most of Facebook's modifications have been like continental drift: so slow that members, perhaps, were unaware of how dramatically the product was changing from the one they'd originally signed up for.

Metaphor time. Let's say you're renting a home in a housing development where 10-foot-tall brick walls separate each house from the next. Over the years, the corporation that owns the development starts to remove those walls, brick by brick, while charging the same rent. At some point the walls are low enough that people complain, saying the company is at fault for changing the product they agreed to pay for every month. The complication here is that a Facebook account is free, so the flip side of the argument is that because members aren't paying, the company has no obligation to maintain those terms. Expect this to be a point of contention.

Why hasn't this kind of force been directed at Google?
Well, it has. Lawmakers have scrutinized Google's enormous scope quite a few times over the years, from questions surrounding its acquisition of DoubleClick to how it handles the privacy implications of behavioral advertising. It has been subject to antitrust scrutiny in general, too. And the Department of Justice was about to start poking its nose into a proposed Google-Yahoo search deal until Google itself pulled the plug.

Again, the matter of concern to politicians is not the massive trove of user information that Google stores, but the likelihood that it will share that data with third parties or abruptly make parts of it public. It may be a blessing in disguise that Google's forays into social networking have been lukewarm at best. Had Google Buzz, which was greeted at launch with scathing criticism of its handling of user privacy, been a bigger success, D.C. might have spoken up. Ten privacy commissioners from around the world did petition Google CEO Eric Schmidt with concerns about Buzz, but the U.S. FTC wasn't among them.

Did Facebook see this coming?
Yes: just take a look at some of the company's hiring patterns. Two years ago, Facebook hired Google veteran Elliot Schrage as its head of global communications and public policy. Schrage brought with him a legal background and a history of dealing with both lawmakers and lobbyists. Late last year, the company hired D.C.-based former journalist Andrew Noyes to handle Beltway-specific issues like "enhancing cybersecurity and online safety, expanding digital privacy protection through user control of data, and protecting free speech" (per a statement from the company at the time). It's obvious that Facebook knew that as it grew, it would have to start dealing with lawmakers, lobbyists, and regulators. This is a company that wanted to establish a presence in government circles before it strictly needed one.

At the same time, we've heard from insiders that the nature of the recent concern from D.C. lawmakers has indeed made Facebook nervous. The company knew it would come into the Beltway's crosshairs, but it wasn't expecting such immediate force so soon after this month's F8 conference. Right now, Facebook is undoubtedly concerned not just about the potential clamps that regulation, FTC or otherwise, could put into place, but about the negative PR that might turn off consumers and advertisers, as it did with Beacon.

So when does Mr. Zuckerberg go to Washington?
We asked, but Facebook has not yet replied to an inquiry about whether the company may be pulled onto Capitol Hill any time soon. If Google's history with D.C. is any indication, yes, we'll be seeing Facebook executives from Palo Alto heading east to put their best face forward.

What might change?
In the short term--as in the past--Facebook may make some modifications, however superficial, to alleviate image concerns that could affect its relationships with advertisers and other partners or cause user dissatisfaction to snowball. We saw this with Beacon, which was eventually shuttered, and with prior modifications to Facebook's privacy policies that users found confusing or unnerving.

It's unlikely that Facebook will give in to critics' requests and go so far as to make "Instant Personalization" an opt-in rather than opt-out program, since that could completely derail the program's effectiveness--for better or for worse. Changes on the current privacy front may come more in the language used to explain the privacy controls than in the controls themselves.

If Schumer has his way, there will be far more long-term changes as well, with social-network privacy getting its own FTC regulation--something that would affect companies beyond Facebook, from Twitter to Google to the dozen or so "geolocation" start-ups out there. This could take some time to push through, though the FTC's handling of blogger freebie disclosures (something that directly affects far fewer people than social-network privacy) shows that it's willing to tackle digital-media issues swiftly, even when the tech industry doesn't consider that to be in its best interests.

For now, Facebook is standing firm. A response letter from Schrage to the senators insists that "the collective changes we announced last week will result in more control for users, not less" and that "we welcome a continuing dialogue."