
Facebook, Cambridge Analytica and data mining: What you need to know

The world's biggest social network is at the center of an international scandal involving voter data, the 2016 US presidential election and Brexit.

Ian Sherr, Contributor and Former Editor at Large / News

Facebook CEO Mark Zuckerberg

James Martin/CNET

Consultants working for Donald Trump's presidential campaign exploited the personal Facebook data of millions.

Last month, The New York Times and the UK's Guardian and Observer newspapers broke the news that the social networking giant had been duped by researchers, who reportedly gained access to the data of millions of Facebook users and then may have misused it for political ads during the 2016 US presidential election. Facebook said it was investigating the reports, which involved data consultancy Cambridge Analytica.

Over the past three-plus weeks, the situation has snowballed. Facebook CEO Mark Zuckerberg was in Washington this week to testify before Congress. Meanwhile, the number of accounts affected has risen to 87 million from initial reports of 50 million. Separately, Facebook said it was purging pages linked to a Russian troll farm that's known for creating fake online identities and posting on both sides of politically divisive issues.

Cambridge Analytica reportedly acquired the data in a way that violated the social network's policies. It then reportedly tapped the information to build psychographic profiles of users and their friends, which were used for targeted political ads in the UK's Brexit referendum campaign, as well as by Trump's team during the 2016 US election. 

Facebook says it told Cambridge Analytica to delete the data, but reports suggest the info wasn't destroyed. Cambridge Analytica says it complies with the social network's rules, only receives data "obtained legally and fairly," and did wipe out the data Facebook is worried about.

Here's what you need to know.


What is Cambridge Analytica?

Cambridge Analytica is a UK-based data analytics firm, whose parent company is Strategic Communication Laboratories. Cambridge Analytica helps political campaigns reach potential voters online. The firm combines data from multiple sources, including online information and polling, to build "profiles" of voters. It then uses computer programs to predict voter behavior, which could be influenced through specialized advertisements aimed at the voters.

Cambridge Analytica isn't working with a small amount of user data. The company says it has "5,000 data points on over 230 million American voters" -- or pretty much all of us, considering there are an estimated 250 million people of voting age in the US.

The company has since faced criticism for what executives, including CEO Alexander Nix, said in a series of undercover videos shot by the UK's Channel 4. In the videos, Nix discussed lies and apparent blackmail he'd perform as part of his efforts to sway elections.

"We have lots of history of things," Nix said in the videos, "I'm just giving you examples of what can be done and what, what has been done."

Nix has since been suspended from his job as CEO. His comments "do not represent the values or operations of the firm and his suspension reflects the seriousness with which we view this violation," the company said in a statement.

What did Cambridge Analytica do?

Facebook said in a statement on March 16 that Cambridge Analytica received user data from Aleksandr Kogan, a lecturer at the University of Cambridge. Kogan reportedly created an app called "thisisyourdigitallife" that ostensibly offered personality predictions to users while calling itself a research tool for psychologists.

The app asked users to log in using their Facebook accounts. As part of the login process, it asked for access to users' Facebook profiles, locations, what they liked on the service, and importantly, their friends' data as well.
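For a sense of the mechanics, here's a rough, hypothetical sketch (in TypeScript, and emphatically not Kogan's actual code) of what that kind of permission request looked like under Facebook's old v1.0 API. The app ID, redirect URL and token handling are placeholders; the permission names shown are the ones that existed at the time.

```typescript
// Hypothetical sketch only -- not Kogan's actual code. It shows the shape of a
// pre-2015 Facebook Login request, when "friends_*" permissions let an app
// read data about a consenting user's friends.

const APP_ID = "YOUR_APP_ID"; // placeholder
const REDIRECT_URI = "https://example.com/callback"; // placeholder

// Step 1: send the user to Facebook's OAuth dialog, listing the permissions
// (scopes) the app wants.
const loginUrl =
  "https://www.facebook.com/dialog/oauth" +
  `?client_id=${APP_ID}` +
  `&redirect_uri=${encodeURIComponent(REDIRECT_URI)}` +
  "&scope=email,user_likes,user_location,friends_likes,friends_location";

console.log("Send the user here to grant access:", loginUrl);

// Step 2: with the access token returned after consent, read the user's own
// profile, likes and location -- and, under the old v1.0 API, their friends' too.
async function fetchProfileAndFriends(accessToken: string) {
  const me = await fetch(
    `https://graph.facebook.com/v1.0/me?fields=id,name,email,location,likes&access_token=${accessToken}`
  ).then(r => r.json());

  const friends = await fetch(
    `https://graph.facebook.com/v1.0/me/friends?fields=id,name,likes,location&access_token=${accessToken}`
  ).then(r => r.json());

  return { me, friends };
}
```

The key point is the last two permissions in that scope list: under the old API, one person's consent opened a window onto data about friends who never installed the app.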


Facebook's data appears to have been improperly used for political purposes during the UK's Brexit vote and the 2016 US presidential election.

Getty Images

The problem, Facebook says, is that Kogan then sent this user data to Cambridge Analytica without user permission, something that's against the social network's rules.

"Although Kogan gained access to this information in a legitimate way and through the proper channels that governed all developers on Facebook at that time, he did not subsequently abide by our rules," Paul Grewal, a vice president and general counsel at Facebook, said in a statement.

Kogan didn't respond to requests for comment. The New York Times said he cited nondisclosure agreements and declined to provide details about what happened, saying his personality prediction program was "a very standard vanilla Facebook app."

A former Cambridge Analytica executive, Brittany Kaiser, said it's possible more people's profiles have been caught up in the scandal than the 87 million Facebook has so far counted. "It is almost certain," she said in a hearing before the UK Parliament's Digital, Culture, Media and Sport (DCMS) committee on April 17. 

What does this have to do with Trump?

The Trump campaign hired Cambridge Analytica to run data operations during the 2016 election. Steve Bannon, who eventually became Trump's chief strategist, was also reportedly vice president of Cambridge Analytica's board. The company helped the campaign identify voters to target with ads, and gave advice on how best to focus its approach, such as where to make campaign stops. It also helped with strategic communication, like what to say in speeches.

"The applications of what we do are endless," Nix said last year in an interview with CNET sister site TechRepublic.

The White House didn't respond to a request for comment.

Cambridge Analytica also worked with other 2016 presidential election campaigns, according to its website and various media reports. Those included the campaigns of Sen. Ted Cruz and candidate Ben Carson, who went on to join Trump's cabinet as secretary of housing and urban development.

Why did Facebook ban Cambridge Analytica from its service? 

Facebook said Cambridge Analytica "certified" three years ago it had deleted the information, as did Kogan. But since then, Facebook said, it's received reports that not all the user data was deleted. The New York Times reported at the outset of this controversy that at least some of it remains.

Cambridge Analytica said in a statement that it deleted all the data and is in contact with Facebook about the issue.

Meanwhile, Christopher Wylie, the whistleblower who detailed how Cambridge Analytica reportedly misappropriated the Facebook data, said on Twitter that his Facebook account had been suspended. A few days later, he held a press conference to discuss his situation and the larger controversy.

"I'm actually really confused by Facebook," Wylie said. "They make me out to be this suspect or some kind of nefarious person."

Was Facebook hacked?

The New York Times characterized the original problem as a data "breach" and said it's "one of the largest data leaks in the social network's history." That's in part because the roughly 270,000 users who gave Kogan access to their information allowed him to collect data on their friends as well. In total, more than 87 million Facebook users are said to have been affected.

The misuse of this data is what The New York Times zeroed in on.

Facebook, however, says that while Kogan mishandled its data, all the information Kogan got was accessed legally and within its rules. The problem is that Kogan was supposed to hold on to the information himself, not hand it over to Cambridge Analytica or anyone else. Because the information was accessed through normal means, Facebook disputes the characterization of the incident as a breach.

"People knowingly provided their information, no systems were infiltrated, and no passwords or sensitive pieces of information were stolen or hacked," the company said.

Of course, critics point out that Kogan was able to do what he allegedly did because Facebook allowed app developers to request and receive access to the data of users' friends. Facebook changed that policy in 2015, prohibiting the practice.

Wait, so Facebook allows apps to access my data?

When you log in to an app using your Facebook account, the developer typically asks for access to information the social network has. Sometimes it's just your name and email address. Other times, it's your location and your friends' data too.

All this is pretty much what any app developer that works with Facebook was allowed to do until 2015, when Facebook prevented app developers from accessing friends' data. Everything else, though, is still fair game.
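As a rough illustration of the difference that 2015 change made (again a hedged sketch with a placeholder access token and an example API version, not Facebook's official sample code): an app can still request fields about the person who signed in, but the old friends_* permissions are gone, and the friends endpoint only returns people who also use that same app.

```typescript
// Illustrative sketch of a post-2015 Graph API request. Token handling is
// omitted, and v2.12 is simply the version current in early 2018.

async function fetchOwnProfile(accessToken: string) {
  // The app can still read fields about the user who logged in and consented...
  const me = await fetch(
    `https://graph.facebook.com/v2.12/me?fields=id,name,email,picture&access_token=${accessToken}`
  ).then(r => r.json());

  // ...but /me/friends now lists only friends who also authorized this app;
  // there is no longer a permission that exposes non-consenting friends' data.
  const friends = await fetch(
    `https://graph.facebook.com/v2.12/me/friends?access_token=${accessToken}`
  ).then(r => r.json());

  return { me, friends };
}
```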

Facebook says its rules specify that developers can't share the information they receive with other firms. That's where the problem with Kogan and Cambridge Analytica comes up.

The company has an app review process it puts developers through. Once they're cleared, things are A-OK.

You hand your information over to app developers all the time. Don't like it? Think before you click. And read the requests from app developers more carefully.

Facebook, by the way, is hoping to stop the next Cambridge Analytica. It's offered a bounty to anyone who finds apps that misuse Facebook data. The company has also revamped its tools to help you identify which apps have access to your data, as well as those to strengthen security of your profile. Facebook also made it easier to download data it has on you.

Could this lead to more regulation?

Zuckerberg himself said it might. 

"I'm actually not sure we shouldn't be regulated," he said in an interview with CNN on March 21. "The question is, what is the right regulation?"

He answered that question on April 6, saying he supports the Honest Ads Act, a proposed law that would require tech companies to disclose how political ads are targeted and how much they cost.


Lawmakers on both sides of the Atlantic are looking for answers from Facebook CEO Mark Zuckerberg.

James Martin/CNET

Regardless of whether that bill becomes a law, there's one thing we know for sure: The honeymoon between the tech industry and government is over. After decades of (mostly) treating tech companies as favored children, legislators and government regulators are increasingly taking a tougher stance against them.

Already, this scandal has renewed calls for more regulation.

"This latest fiasco could reignite the debate within the Beltway and EU around a tighter regulatory environment Facebook and its social platform brethren could face going forward," Daniel Ives, an analyst at GBH Insights, wrote in a note to investors right after the controversy erupted. "This represents another critical period for Facebook to hand hold and assure its users and regulators around tighter content standards and platform security in light of this latest PR nightmare."

Facebook also faces an investigation by the Federal Trade Commission over whether it violated a 2011 consent decree. Companies that have settled previous FTC actions, the US agency said, must comply with FTC order provisions imposing privacy and data security requirements.

"Accordingly, the FTC takes very seriously recent press reports raising substantial concerns about the privacy practices of Facebook," the agency said in a statement on March 26. "Today, the FTC is confirming that it has an open non-public investigation into these practices."

The consent decree required Facebook to notify users and get their agreement before sharing their data. Facebook earlier told The Washington Post it rejects "any suggestion of violation of the consent decree."

In Europe, where regulators have traditionally taken a tough stance on social media and privacy, the president of the European Parliament, Antonio Tajani, tweeted that EU lawmakers "will investigate fully, calling digital platforms to account." In the UK, Damian Collins, the chair of Parliament's committee overseeing digital matters, said Zuckerberg needs to stand up and answer questions directly.

What happened in Zuckerberg's appearance before Congress?

A little over three weeks after the Cambridge Analytica news broke, Zuckerberg went to Washington, where over two days he endured 10 hours of questioning by congressional committees. Echoing earlier statements, he apologized to lawmakers for Facebook's recent missteps and voiced support for some regulation of the tech industry.

In his first day of testimony, he did score some points. Zuckerberg addressed a room full of Senate Judiciary and Commerce Committee members who struggled to understand what Facebook does, how the social platform works, and how to regulate it. He escaped largely unscathed, having settled into his role as both an explainer of technology and a receiver of the occasional finger-wag.

But on day two, things got a little rougher. His appearance before the House of Representatives' Energy and Commerce Committee was defined by pointed questions from lawmakers who appeared to have done their homework.

Some, like New Jersey Rep. Frank Pallone, hammered Zuckerberg on default privacy settings. California Rep. Anna Eshoo asked Zuckerberg if his own data was swept up in the Cambridge Analytica scandal. (He said that it was.) And Florida Rep. Kathy Castor and New Mexico Rep. Ben Lujan raised concerns about how much Facebook follows people as they browse the web -- and whether people without accounts on the social media network still get tracked via "shadow profiles." Zuckerberg said that he wasn't familiar with that term and that Facebook collects data on nonusers for security purposes.

"Your business is built on trust, and you're losing trust," Lujan said.

But apparently Facebook hasn't lost Wall Street's confidence. The company's shares rose approximately 5 percent over the two days of testimony.

Was this similar to what the Obama campaign did on Facebook?

Sort of. The Obama campaign did collect a similar level of data from its app, which included both your information and your friends' information.

But as Politifact notes, users were willingly giving up that information and knew it was going to a political campaign. The Obama campaign used your friends' data to figure out who may or may not be willing to vote for him, and sent messages to users to persuade their friends.

That's different from the Cambridge Analytica situation, since most users who took Kogan's "thisisyourdigitallife" quiz had no idea their data would be used for political purposes.

What's Facebook doing about this?

After five long days, Zuckerberg broke his silence on March 21 with a nearly 1,000-word post on his Facebook page. (C'mon, did you really expect it to show up on Twitter?) The post was his first since March 2, when he shared a photo of his family celebrating the Jewish holiday of Purim.

Zuckerberg acknowledged that Facebook had made mistakes with users' information. "We have a responsibility to protect your data," he wrote. "And if we can't then we don't deserve to serve you."

He's since sat down for several media interviews, and on April 4, held an hour-long conference call with journalists. "Life is learning from mistakes," Zuckerberg said. "At the end of the day, this is my responsibility. I started this place, I run it, I'm responsible."

The company, he said, is now facing two central questions: "Can we get our systems under control and second, can we make sure that our systems aren't used to undermine democracy."

"It's not enough to give people a voice, we have to make sure that people are not using that voice to spread disinformation," he added.

And, specifically, he acknowledged that Facebook has "to ensure that everyone in our ecosystem protects people's information."

He's promised to investigate apps that had access to "large amounts of information" before the company restricted how much data third-party apps could access in 2015. Facebook will conduct a full audit of apps that exhibit suspicious behavior and bar developers who don't agree to audits.

On April 6, Facebook said it was banning AggregateIQ, another political analytics firm that's reportedly tied to Cambridge Analytica's parent company, SCL. (AggregateIQ denies this connection.) Facebook said it instituted the ban out of concern that AggregateIQ may have improperly received Facebook user data as well.

Facebook's public missteps have brought up other concerns about the company too. One example is a memo leaked to BuzzFeed penned by Andrew "Boz" Bosworth, a top Facebook executive. The 2016 memo advocates growth above everything else, regardless of whether people use Facebook to bully and harass one another.

"The ugly truth is that we believe in connecting people so deeply that anything that allows us to connect more people more often is *de facto* good," he wrote at the time. He's since said he was trying to stir debate, and didn't agree with what he'd written.

Facebook is also planning to restrict how much access developers have to your information, limiting the information it gives apps to your name, photo and email address. It'll also revoke an app's access to your data if you haven't used it for three months. 

The company is also planning to further restrict political advertising, Sheryl Sandberg, Facebook's COO, said in an interview with Bloomberg. "If you were using hate-based language in ads for elections, we're drawing those lines much tighter and applying them uniformly," she said.

Last, Facebook will begin displaying a gauge at the top of your News Feed that lets you know which apps you've used and lets you revoke their permissions.

All of that will provide comfort to many users, but for others ...


Are people bailing from Facebook?

They are, though it's still too early to know if that'll have a substantial effect on Facebook's gargantuan user numbers. Right off the bat, the hashtag #DeleteFacebook flared up on Twitter -- backed by, notably, Brian Acton, WhatsApp's co-founder who sold the messaging service to Facebook for $19 billion.

We're also starting to see some action that could hit Facebook in the wallet. Within days of the scandal erupting, Firefox maker Mozilla said it would no longer advertise on Facebook because of data privacy concerns, and it launched a petition to ask the social network to improve its privacy settings. Meanwhile, Tesla and SpaceX CEO Elon Musk has taken a different kind of stand. Prompted by an inquiry from a Twitter user, he quickly deleted both companies' Facebook pages. So did Playboy, for what it's worth.

Beyond those high-profile moves, a recent survey from the anonymous employee social network Blind found that 31 percent of tech workers plan to delete Facebook too. Coverage of Facebook has also turned negative, a survey by BuzzFeed found.

Still, Zuckerberg said in a call on April 4 that the larger #DeleteFacebook campaign hasn't had a noticeable effect on Facebook's active user counts.

Ultimately, reform is what's needed, said former Cambridge Analytica executive Brittany Kaiser. "For many years, I never questioned it," Kaiser said. "That's the way that the political system works. That's the way that advertising works. That's the way that every single industry that exists in the entire basis of digital communications works. I do really understand the industry, and I have the ability to be a voice for change."

What can I do?

There isn't much. You may have been swept up in this without even knowing it. You don't have to have downloaded Kogan's app to have had your information accessed, since the statements and articles say the app slurped up information about users' friends.

Cambridge Analytica also doesn't appear to offer a way for you to request your information be removed from its systems. The company didn't respond to a request for comment.

As for Facebook, you can always try to lodge a complaint with Zuckerberg.

You should also check your privacy settings on Facebook and consider these ways to stop sharing data with Facebook. And if you're really unhappy, you could get involved in a class action lawsuit. You could also join the #DeleteFacebook campaign. Here's how to do it.

First published March 17 at 1:52 p.m. PT.
Updates, March 18 at 3:21 p.m.: Adds analyst comment about regulation; March 19 at 10:17 a.m.: Adds info on calls for action in Washington and Europe; 5:14 p.m.: Includes summary of Washington Post report questioning whether Facebook violated consent decree; March 20 at 9:32 a.m.: Adds info on potential FTC investigation; 3:32 p.m.: Includes details on Zuckerberg's silence and Channel 4's undercover investigation of Cambridge Analytica; 6:14 p.m.: Adds comment from Chris Wylie and details about #DeleteFacebook campaign and class-action lawsuit; March 21 at 1:57 p.m.: Includes Zuckerberg's Facebook post and plans to improve data security; 6:35 p.m.: Adds Zuckerberg's comments in interviews; March 23 at 12:18 p.m.: Adds info on Mozilla and Musk seeking distance from Facebook. March 26 at 9:12 a.m. PT:  Adds FTC's confirmation that it's investigating Facebook. 10:17 a.m. PT:  Adds Senate Judiciary's invitation for Zuckerberg to testify. March 28 at 4:57 p.m. PT: Adds details about Facebook's new privacy tools, a survey about tech workers deleting Facebook and Playboy shutting down its Facebook presence. March 30 at 10:35 a.m. PT: Adds details about a leaked Facebook memo advocating growth despite issues like online harassment. April 4 at 6:35 a.m. PT: Adds that Zuckerberg will testify before Congress on April 11. April 4 at 1:02 p.m. PT: Updates the number of accounts affected; 5:13 p.m.: Adds Zuckerberg comments from conference call with journalists. April 5 at 3:03 p.m. PT: Adds comment from Sandberg about political advertising. April 6 at 9:39 p.m. PT: Adds Facebook banning AggregateIQ and Zuckerberg supporting the Honest Ads Act. April 9 at 11:40 a.m. PT: Adds a note about the Obama campaign. April 12 at 10:51 a.m. PT:  Added information about Zuckerberg's appearances before the US Senate and House of Representatives. April 18 at 5:10 p.m. PT:  Added latest information about Cambridge Analytica and how coverage of Facebook has gone negative.

