The people have spoken -- just not enough of them to have an impact on Facebook's business or future.
First, there are the yeas versus the nays. In the voting on the giant social network's proposed policy changes, 589,141 users (about 88 percent) opposed the changes and 79,731 supported them. But then there's the threshold: under Facebook's own voting rules, a binding verdict required participation by 30 percent (300 million) of Facebook's billion users. The total vote count of 668,872 amounted to just 0.067 percent of potential voters.
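The turnout arithmetic is easy to verify. Here is a quick sanity check in Python using the figures quoted above (the one-billion user count is a round number, as in the article):

```python
# Sanity-check the arithmetic in Facebook's 2012 governance vote,
# using the figures quoted in the article.
opposed = 589_141
in_favor = 79_731
monthly_users = 1_000_000_000  # "a billion users," rounded

total_votes = opposed + in_favor
threshold = monthly_users * 30 // 100  # 30 percent needed for a binding result

print(total_votes)                                   # 668872 ballots cast
print(round(opposed / total_votes * 100, 1))         # 88.1 (% opposed)
print(round(total_votes / monthly_users * 100, 4))   # 0.0669 (% turnout)
print(threshold)                                     # 300000000
```

The gap is stark: turnout would have needed to be roughly 450 times higher to make the result binding.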
Now Facebook can ditch the voting structure and fully integrate its Instagram acquisition, as it should.
The new, and soon-to-be-ratified, Data Use Policy and Statement of Rights and Responsibilities include provisions that allow Facebook to store Instagram's server logs and administrative records in a "way that is more efficient than maintaining totally separate storage systems." The change will also allow Facebook to target Instagram users with ads based on the data stored on Facebook's servers, such as their hometowns and likes.
What's clear is that the vast majority of Facebook's inhabitants don't read the terms of service and don't feel compelled to exercise their right to vote. That behavior is common across the Web, unless a company does something that really ticks off a huge number of people, not just a vocal minority. The changes to Facebook's voting system and privacy policies didn't rise to a level that would spawn Super PACs or a massive voter turnout.
Basically, Facebook users rendered their verdict on the measures by not voting, leaving the nearly 88 percent who did vote against the changes feeling powerless.
Marc Rotenberg, president of the Electronic Privacy Information Center, believes that Facebook should withdraw the proposed changes and stand by the majority, even if it's only a tiny percentage of potential voters.
But Facebook never expected to get 300 million out of a billion monthly Facebook users to weigh in. It's not a democracy or state with mandatory and binding elections every year to decide company policy. It's more like a local bar, where people go to gossip, market themselves and their brands, and share news and photos.
In fact, Facebook is a private enterprise that believes it has a public trust to connect people around the world and make our lives better. At the same time, it is designed to make money by targeting its billion members with ads guided by the data Facebook users pour into the company's servers.
Facebook states its mission as giving people the "power to share and make the world more open and connected." CEO and founder Mark Zuckerberg is a kind of benevolent, sometimes consensus-driven, dictator, as is any worthwhile chief executive who leads a public company and wants to avoid making the natives -- customers, employees, and shareholders -- restless.
In a video recorded when the service had just 200 million members, Zuckerberg explained the Facebook governance process.
But as Facebook continues its colonization, adding a few billion more members and exabytes of data within its walled garden, it could become a de facto nation-state in cyberspace, which would spawn an entirely new and unprecedented set of governance issues.
What happens when 3 billion or 5 billion people are all connected through a single entity that seeks to mirror in the digital world how people interact in the real world? How does Facebook interface with various governments as it wields more power? In the future, will Facebook users acquiesce to the company's artificially intelligent bots shaping their world? Or more near term, can Facebook be trusted to put its users' interest above all else?
Of course, no one is held captive by Facebook. People are free to leave, although it's not clear how they could take their precious data with them. In the meantime, Facebook's brain trust and algorithms will be watching for signs of unrest and revolt among the masses and, like any government with good intentions, will stand ready to quell an uprising with reason and fairness.