The Virginia Democrat says he's hoping people will learn this issue isn't going away.
To Sen. Mark Warner, it's not a question of whether Congress will regulate the tech industry but when.
The Virginia Democrat, who's the vice chair of the Senate Intelligence Committee and one of the more outspoken lawmakers on tech issues, told Facebook COO Sheryl Sandberg and Twitter CEO Jack Dorsey that he's not convinced the tech industry can tackle election interference and bad behavior on their own.
"The era of the wild west in social media is coming to an end," said he said during testimony Wednesday. "Where we go from here is an open question."
Warner isn't the only lawmaker who believes this. A growing number of members of Congress are raising the specter of regulation, though it's still unclear what that would look like. Warner, for his part, has offered several policy prescriptions, such as forcing companies to make it easier for users to download their data and bring it from one social network to another. He's also pitching the Honest Ads Act, which would require tech companies to disclose who pays for political ads, just as radio and television stations already have to.
Warner said in an interview he was also pleased his colleagues stepped up their game, coming to the hearing far more prepared than lawmakers were in April.
"It wasn't stupid questions, it wasn't 'Internet 101,' so I think at least from the committee standpoint, I think we feel like we're moving the ball," he said.
He was also disappointed with Google, which didn't attend the hearing despite an invitation sent to its cofounder and parent company Alphabet's CEO, Larry Page.
"Google made a huge mistake by not attending our hearing," he said. "And all that will do is simply raise questions about certain areas beyond even Russian interference, that people want to ask questions on.
Here are edited excerpts from our conversation with the senator after Sandberg and Dorsey's testimony concluded this week.
Over the past couple months, the tech industry has banned hundreds of accounts it says are part of election influence campaigns. Is it enough?
Warner: Facebook and Twitter, which were initially very slow to respond to the Russian attack, actually I think have gotten religion now, and they are moving forward.
As for whether they're doing enough: they're doing more, but this is not a static environment. I think the government is doing more, the tech companies are doing more, we're getting better. But at the same time, the adversaries are getting better.
This is both cheap and effective. So it's a problem, it's not going to go away and it's not going to get to a point where we can say, OK, we are now 100 percent secure.
Is it ever fixable? Or is it constant whack-a-mole now?
Warner: There will always be a whack-a-mole component. We may not get to 100 percent protection. But clearly, in 2016, we had virtually no protections.
And in many ways, for example, on the election systems, they were so vulnerable, frankly I think the Russians were amazed that our election systems were so open. And maybe it was just our good fortune and some political pressure put on by the then Obama White House that they didn't take more advantage of that vulnerability.
So what are we aiming at here? That it goes up and then is taken down in a certain amount of time?
Warner: We're aiming at trying to educate the public that this is: One, an ongoing problem. That two, it's not a witch hunt or fake news; this is a real foreign country trying to intervene in our most basic democratic process, with a goal, in the past, to help Mr. Trump, but on a going-forward basis, to sow division.
So, a more informed public and electorate about a threat. Also, a more informed public and electorate about not believing everything they see on social media, whether in a Facebook account, on Twitter or in a Google search.
Some people are pushing the tech companies to follow First Amendment-like rules. But if you do that, you can't police issues like hate speech. How do you deal with that?
Warner: That's a healthy debate.
We have First Amendment rights in our country, but you can't scream fire in a crowded theater. So I think even the most zealous advocates of free speech would realize there have to be some guardrails, so you don't scream fire, you don't put up a sign that says go kill your neighbor if he or she is a Muslim.
Further up the food chain, do you allow a site to go out and say that Sandy Hook was all a hoax, and I'm going to go ahead and print the parents' addresses so that they are all harassed and threatened with violence, and many of these families have had to move and change their identity? You know, it's not as clear-cut. But clearly the platform companies have felt some of that messaging was over the top. Now, that's where we'll have a debate, and of course we won't get it 100 percent right at first, but there will at least be some innovation going on.
Should the government be making these calls or the companies?
Warner: I don't think we can completely rely upon simply the goodwill of the corporate shareholders or the management of these companies.
Because these companies have such massive market domination right now, I'm hesitant, again, to intervene and possibly cut off American innovation. Because right behind them, as we know, and I'm not sure most Americans realize, there are Chinese equivalents that are rapidly approaching the same size and scope, and they will come with no protections. So that's why I'm more willing to focus on changes like price transparency and data portability, which really don't get us into the First Amendment realm.