After Facebook announced on Tuesday evening that it had found another set of disinformation campaigns, this time from both Russia and Iran, one thing in particular came to mind: "We came here for the friends."
It's the opening line of Facebook's now-infamous apology commercial, which first ran on TV in April as a mea culpa after the social network had been battered by its Cambridge Analytica data scandal. The commercial goes on to say: "But then something happened. We had to deal with spam, clickbait, fake news and data misuse." (John Oliver brutally spoofed the ad last month on his show Last Week Tonight.)
But even though the commercial was meant to be a step in repairing Facebook's image, "We came here for the friends" serves as the perfect way to highlight how far the giant social media platforms have fallen and how broken they've become.
The scope of the problem is even bigger than we thought. US Sen. Mark Warner, a Democrat from Virginia, said it best in his reaction to Facebook's disclosures Tuesday: It's not just Russia trying to divide the public by abusing social media, it's other foreign actors, too, following the "Kremlin's playbook." The situation seems so rampant that Facebook freely admits, maybe a little passive-aggressively, that "no one company can fight this on their own," as the social network said in a blog post disclosing the news. (Facebook didn't even identify the campaigns itself -- the company was tipped off by the cybersecurity firm FireEye.)
What's more, the large scale of the campaigns Facebook took down Tuesday doesn't tell us much about how many more networks of fake accounts are still on the platform.
"What we don't know yet is just how pervasive this sort of activity is," said Dan Wallach, a professor of computer science at Rice University. "Is there an iceberg underneath that we don't see yet?"
Facebook didn't have any additional comment beyond yesterday's announcement. Twitter didn't immediately respond to a request for comment.
'Reactive to proactive'
But there's good news. If the last few days are any indication, the tech giants are trying in earnest to fix the problem. And they are making progress.
After Facebook's announcement Tuesday that it was removing 652 inauthentic pages, groups and accounts, Twitter followed up with a seemingly related disclosure. It said it suspended 284 accounts with ties to Iran for "coordinated manipulation." And a day before that, Microsoft (which we'll group in even though it's not a social media company) said it discovered and disabled several fake websites designed to trick visitors and allow a group connected to the Russian government to hack into their computers.
On Tuesday, Facebook CEO Mark Zuckerberg touted the change in strategy on a conference call with reporters.
"Security is not something that you ever fully solve," Zuckerberg said. "Our adversaries are sophisticated and well-funded, but the shift we have made from reactive to proactive detection is a big change and is going to make Facebook safer over time."
Facebook has done a number of things to fight the problem. Zuckerberg has said the company is investing in artificial intelligence tools to help police the platform for its more than 2 billion users. The company is also hiring 20,000 people to help moderate content.
The company also announced in July that it had started a special unit to try to spot problems and vulnerabilities before they blow up. The group, called the Investigative Operations Team, is made up of ex-intelligence officers and researchers.
Bruce McConnell, an expert in global cybersecurity cooperation at the East West Institute think tank, said he's seen a change in tone from Zuckerberg. Lately, the CEO has sounded like a cyberdefense leader from the NSA, McConnell said.
"There's a big shift here in stepping up to this role of helping ensure that their platform remains trusted," he said.
Twitter, for its part, says it's rethinking some of the fundamental elements of its service to prevent abuse and fake news. For example, CEO Jack Dorsey said last week that one thing Twitter could do is surround tweets that spread misinformation with tweets that debunk it.
The influence campaign Facebook took down Tuesday included accounts and groups that reached more than 800,000 followers, which computer science expert Wallach said was an important number. Fake accounts are only useful, after all, if they have an audience.
"What that tells me," Wallach said, "is they're going after the big fish."