In the end, big data won.
Not the presidential election -- although there's no doubt that President Obama's victory tonight was aided by a sophisticated understanding of the American electorate born of years of analysis of voting trends and demographic shifts.
No, big data -- and its patron saint, Nate Silver -- won the battle to predict the outcome of the contest between Obama and former Massachusetts Gov. Mitt Romney. Where breathless pundits brandishing equivocating polls shouted from the rooftops over the last few weeks that the race for the White House was a "tossup," or "too close to call," Silver and other poll aggregators sat back and calmly told anyone who would listen that the math told another story: Obama was the clear favorite.
To be sure, after the president's dismal performance in last month's first debate against Romney, his prospects dimmed somewhat. But those who regularly visited Silver's New York Times-hosted FiveThirtyEight blog -- and there's no getting around it: many Democrats lived on the site throughout the fall -- knew that Silver never pegged Obama's chances of victory at less than 61.1 percent.
To those unfamiliar with the notion of poll aggregation and more accustomed to gleaning their perceptions of the trajectory of presidential elections by following venerable polling organizations like Gallup, Silver's numbers never made any sense. With a wide variety of polls showing Obama struggling, and often trailing Romney nationally, how could someone who'd never even run a poll credibly tell the world that the president was actually comfortably ahead?
Indeed, critics of poll aggregation -- a system that analyzes hundreds of state and national polls to arrive at numbers focused not on who would win the popular vote, but on who would take the Electoral College -- grew increasingly vocal in their skepticism as the calendar edged closer to November 6. No one voiced that skepticism more clearly than Politico's Dylan Byers who, on October 29, penned an incredulous article titled "Nate Silver: One term celebrity?" In it, Byers wrote, "more than a few political pundits and reporters, including some of his own colleagues, believe Silver is highly overrated."
Tonight, with FiveThirtyEight poised to have correctly predicted the winner in all 50 states, Byers should be considering that maybe, just maybe, Silver knew what he was talking about.
For the Nate-haters, here's the 538 prediction and actual results side by side twitter.com/cosentino/stat... -- Michael Cosentino (@cosentino) November 7, 2012
Silver, of course, wasn't alone. There were at least four other prominent poll aggregators -- TPM's PollTracker, HuffPost Pollster, the RealClearPolitics Average, and the Princeton Election Consortium -- and all of them correctly predicted not just that Obama would emerge victorious tonight, but that he would dominate in the swing states of Ohio, Virginia, Wisconsin, Nevada, New Hampshire, Colorado, and Iowa. They all also correctly agreed that Romney would carry North Carolina. But only Silver and HuffPost Pollster went out on a limb and predicted that the president would take Florida, both having moved that state into Obama's column only the night before the election.
Although pollsters like Gallup and Rasmussen Reports maintained until the end that Romney would win the national popular vote, many national polls did predict Obama's victory. And some critics of Silver's methods may point to that as evidence that he simply got lucky in the end. As MSNBC's Joe Scarborough told Byers in the Politico story, "Nate Silver says this is a 73.6 percent chance that the president is going to win? Nobody in that campaign thinks they have a 73 percent chance -- they think they have a 50.1 percent chance of winning."
But it's hard to argue with going 50 for 50 at the state level -- the only measures that really matter in a presidential election. Though Florida, Virginia, and Nevada are still formally too close to call as of this writing, the president is winning in each state.
To be sure, Silver was not quite as accurate in predicting margins of victory at the national and state levels as he was in 2008. For example, as of this morning, he had Obama winning nationally by 2.5 percentage points, and in Ohio by 3.6 points. As of this writing, the president was winning nationally by just half a point, and in Ohio by two points or less.
Still, there can be little doubt that the methods used by Silver and his aggregator brethren -- even RealClearPolitics, which did nothing more than average recent polls -- performed as advertised and gave those who believed in such a system ample reason to count on their accuracy in future elections.
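The RealClearPolitics-style approach really is that simple: take the most recent polls in a race and average their margins. A minimal sketch, using hypothetical poll numbers (not actual 2012 data):

```python
# Simple poll averaging in the style of the RealClearPolitics Average:
# average the Obama-minus-Romney margin across recent polls.
# The poll figures below are hypothetical, for illustration only.

recent_polls = [
    {"pollster": "Poll A", "obama": 49, "romney": 47},
    {"pollster": "Poll B", "obama": 48, "romney": 48},
    {"pollster": "Poll C", "obama": 50, "romney": 46},
]

# Margin for each poll, in percentage points.
margins = [p["obama"] - p["romney"] for p in recent_polls]
average_margin = sum(margins) / len(margins)

print(f"Average Obama margin: {average_margin:+.1f} points")
# prints: Average Obama margin: +2.0 points
```

More sophisticated aggregators like FiveThirtyEight go further, weighting polls by recency, sample size, and pollster track record, but the averaging step above is the common core.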
Silver himself may have best summed up the difference between the computational underpinnings of his system and the critics who mocked it. "You may have noticed some pushback about our contention that Barack Obama is a favorite (and certainly not a lock) to be re-elected," he wrote on November 2. "I haven't come across too many analyses suggesting that Mitt Romney is the favorite. (There are exceptions.) But there are plenty of people who say that the race is a 'tossup.' What I find confounding about this is that the argument we're making is exceedingly simple. Here it is: Obama's ahead in Ohio."
And how did he know? Not through irrational belief or blind wishes, but through painstaking analysis of every poll of the Buckeye State available to him, and 100,000 simulated elections showing, when all was said and done, that the most crucial state in this year's election, one Romney could not win the presidency without, was not the nail-biter many said it was, but rather one where the president held a comfortable lead.
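The simulation step described above can be sketched in a few lines. This is an illustrative Monte Carlo toy, not Silver's actual model: it assumes a single aggregated Ohio margin and a normally distributed polling error (the 3-point standard deviation is an assumption for the example), then counts how often Obama comes out ahead across 100,000 simulated elections.

```python
import random

# Toy Monte Carlo election simulation (not FiveThirtyEight's actual model).
# Draw a simulated election-day margin from a normal distribution centered
# on the aggregated polling margin, and count the share of simulations
# in which Obama finishes ahead.
random.seed(0)  # fixed seed so the run is reproducible

ohio_margin = 3.6     # Obama's aggregated Ohio lead in points (from the article)
polling_error = 3.0   # assumed std. dev. of polling error, in points (illustrative)
n_sims = 100_000

obama_wins = sum(
    1 for _ in range(n_sims)
    if random.gauss(ohio_margin, polling_error) > 0
)

print(f"Obama wins Ohio in {obama_wins / n_sims:.1%} of simulations")
```

Even with a realistic polling error baked in, a consistent few-point lead translates into a win probability far above a coin flip -- which is exactly the gap between "Obama's ahead in Ohio" and the pundits' "tossup."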
Score one for the quants, especially the most famous of them all, a statistician who is now unquestionably not a one-term celebrity, but a political prediction machine to be taken very, very seriously.