Sen. Mark Warner of Virginia wants top tech companies to get more serious about ridding their sites of Russian propaganda.
The top Democrat on the Senate Intelligence Committee will be grilling executives from Facebook, Google and Twitter in a hearing Wednesday. He'll be looking to better understand how foreign agents working on behalf of Russia were able to spread disinformation on these social networks during the 2016 presidential campaign. The execs will be testifying at three separate congressional hearings Tuesday and Wednesday.
Warner, a former technology and telecommunications investor turned politician, has emerged as one of the harshest critics of the companies. Earlier this month he introduced the Honest Ads Act, a bill requiring digital platforms with more than 50 million users to follow the same rules for online political ads that already cover ads sold on TV and radio. That means companies will need to disclose who's paying for such ads.
While critics describe the legislation as more bark than bite, Warner says it's an important step in forcing companies like Facebook and Twitter to take more responsibility in fighting online propaganda and political advertising from foreign governments.
Facebook, Google and Twitter have acknowledged that more could have been done to prevent the placement of such ads on their sites. Facebook has said that more than $100,000 worth of ads were bought by what now appear to be Russian agents. Moreover, it said that more than 126 million people in the US saw Russian-backed content during the election. Twitter discovered 2,752 accounts that may be tied to those same agents. Google also reportedly found that Russians paid tens of thousands of dollars for ads on YouTube, Gmail and Google search.
CNET interviewed Warner by phone last week on two separate occasions. Below is an edited transcript of the conversations in which he acknowledges the limitations of his proposed legislation and warns that big companies like Facebook and Twitter need to be doing more to address the issue of Russian interference in US elections.
One of the biggest criticisms of the legislation you introduced is that it only targets paid political advertising on social media. But it does nothing to combat the problem of "fake news," which is often disguised as legitimate.
Warner: They work hand in glove. You may have ads that point to a site, and that site might be a fake site. And if the site or account is fake we might not scoop that up in this legislation. What we're trying to do is provide the lightest touch possible that is not going to slow innovation and invade anyone's First Amendment rights. But you are right that this in and of itself will not solve all problems.
There is also a reasonable expectation that the platform companies will try to determine if this content is being paid for by foreigners. We get at this partially [in the legislation] by saying the companies have to make reasonable efforts to discover if the advertising is being paid for by a foreign entity.
There's already a prohibition against foreign money interfering in elections. So this wouldn't be a foolproof method, but there'd be the expectation that the social media companies would make a good-faith effort.
But in the 2016 election most of the more harmful activity wasn't from ads that people paid for but from sites writing things that just weren't true.
Warner: There are clearly accounts where someone says "I am Mark Warner," but they've misrepresented who they are. A company like Facebook is already trying to eliminate those fake accounts. Twitter has been more willing to allow some fake accounts. So again we're not going to sweep everything up, but I do think this might help shed light on the advertising people see when they "like" certain pages or "like" certain groups.
What I've been learning is that you can wreak a lot of havoc with relatively small amounts of money, 50,000 internet bots, and 40 paid hackers who can create fake accounts.
Here's a classic example that I've been talking to my Republican colleagues about. There was the fake Twitter account for the Tennessee Republican party called TEN_GOP. People probably should have realized it wasn't created by an American because it abbreviated Tennessee as TEN rather than TENN. Anyway, it had 136,000 followers. Meanwhile, the actual Tennessee Republican party had only 13,000 followers.
That gives you an idea of the scope of what we're talking about here.
Do you think this legislation will stop that? Again, this is not overt political advertising. This is propaganda that hasn't been paid for in the traditional sense.
Warner: Yes, you are right. [The legislation] is not going to fix everything. But I don't see how anyone could legitimately argue that the same rules that apply to TV and radio shouldn't apply to digital. And how could anyone defend the notion that Americans shouldn't know if there are paid foreign ads popping up on their newsfeeds?
But you are right. That doesn't get to the question you raised of the false accounts. That is a huge part of the problem. And that is where we need cooperation from the platforms.
Facebook and Twitter have already promised more transparency around advertising on their sites. Twitter said it would disclose who buys political ads and banned two Russia-based media organizations from purchasing advertisements. Is that enough?
Warner: I think they're all moving in the right direction. But none of them have been doing enough. Frankly, their initial reaction to Russian involvement has been a bit embarrassing. They underestimated how serious the problem was.
The proof will be in the pudding in terms of what they say and present on Nov. 1 at the hearings.
What are you hoping they'll say?
Warner: I'm interested to see how forthcoming they'll be. Some of these fake accounts had literally hundreds of thousands of followers. So the idea that it was only 3,000 ads on Facebook isn't by any means a complete enough picture of the extent of the problem.
It looks like these companies are promising more transparency to avoid government regulation. Will that work?
Warner: We have proposed what we think is the lightest touch possible in terms of regulation to move the bar. We welcome their efforts to try to get this right. But the operating principle so far from these companies -- that they are above it all and bear no responsibility for any content on any level -- just doesn't wash anymore.
This is not the first time we've had these debates. We had it when there was child exploitation going on over the internet, and we had it when there was advocacy of terrorist and hate propaganda on the internet. This is an evolving field. But the message we want to send is that these companies need to be in the game.
So are you saying they can't hide or try to duck out of taking responsibility for helping solve these issues?
Warner: I think there will be some that try to duck. But for companies that depend upon the confidence of their users, I think they do it at their own peril.
Does this mean that if you don't see them taking responsibility and action, the government will step in with regulation?
Warner: I think their own users will say, "Why would we trust them at all?" Just look at what's happened in Estonia. The Russian intervention efforts were so extensive there that, my understanding is, the Estonian public and government rose up and said we have to have a new system. Now they have personal identifier numbers on social media accounts. And the public has been trained to spot Russian propaganda.
I don't think that is the right solution here. But I hope these companies see it is in their own best interest to make sure they don't lose the confidence of their trusted users. This is their chance to prove whether they stand by their own credos.
Do you think citizens also bear some responsibility to be better consumers of news? In other words, maybe you shouldn't believe everything you read on the internet, including on Facebook and Twitter.
Warner: I agree with you. I do think that citizens need to be more discerning. They need to be better curators of their own news. If something seems too outrageous maybe that requires a second look.
But it is hard sometimes when you've got some innocuous-sounding group that you think you can trust. What we've seen is that in many cases these fake accounts create whole personalities. They write blogs; they comment on other subjects. They lure you in, often on issues that may not even be related to politics, and they gain followers' trust. Then they start populating the stories with political positions. We've seen some that started with cooking, food and a whole host of other ideas that have nothing to do with politics, where the accounts were used to gain trust and then to abuse that trust.
How do you prevent foreign interference on social media platforms but also make sure First Amendment rights are protected? A lot of advocacy groups use these platforms, too.
Warner: The internet is and should be a free and open forum. I don't want anyone who legitimately identifies himself or herself as an advocate to not be heard. If you want to advocate for unpopular views or you want to advocate for views in favor of other foreign interests, you ought to be able to do that. But we should also be able to distinguish between those individuals advocating, and those who are presenting themselves as Americans but who are actually foreign agents. The public has a right to know. There is not a First Amendment right that extends to foreigners who misrepresent who they are in an American political debate.
President Trump recently threatened via a tweet to have the Federal Communications Commission revoke broadcast licenses of news organizations he believes are reporting fake news. Does that concern you?
Warner: The president's tweetage concerns me on a regular basis. You've seen FCC commissioners and others reject those musings. The good news is that there may be only one person in the administration who refuses to acknowledge that the Russians interfered in our electoral process and spread disinformation. The problem is, that one person is the president.
Correction, 10:58 a.m. PT: This story has been changed to note that 126 million Facebook users saw Russian-backed content, and not just ads.