Americans Have Collective Amnesia Around Polling

GEN: Lost in a Gallup runs through many of the major polling errors of the last 80-plus years: the Chicago Daily Tribune’s infamous “Dewey Defeats Truman” headline, Jimmy Carter’s perceived lead over Ronald Reagan in 1980, erroneous exit polls in 2004 that suggested John Kerry had unseated George W. Bush for the presidency, Donald Trump’s defeat of Hillary Clinton in 2016.

Given the history here, why do we still act so shocked when a poll proves inaccurate?

W. Joseph Campbell: One of the reasons I think we don’t learn from past mistakes is that polling failure and polling errors are never quite the same, just as no two presidential elections are quite alike. They’re not all like “Dewey Defeats Truman” in 1948. There are other varieties, and among those is the one we saw in 2016: key states in the upper Midwest — Wisconsin, Michigan, and Pennsylvania in particular — swung unexpectedly to Donald Trump. No one really anticipated that kind of outcome because the polls in those states were consistently showing Hillary Clinton ahead. So there’s polling failure at the state level, which really has never had such a profound national impact before.

Our collective memory tends to be short. We don’t know, as a public, much about elections of the past. Therefore, we don’t know much about polling mistakes in the past. That’s partly due to the news media, because the news media are always forward-looking. There’s not a whole lot of emphasis on retrospection in either journalism or polling. Not every election offers a polling surprise, but almost every presidential election has a polling controversy of some kind.

You also talk in the book about journalism’s reverence for so-called shoe-leather reporting. I can think back to 2016, when all the magazines published these huge pieces on Trump rallies and offered this carnivalesque look at Trump supporters. Did such work obscure the polling data?

Interesting question. Nobody knew ahead of time that Wisconsin, Michigan, and Pennsylvania were real candidates to swing to Trump. Those states were characterized in 2016 as Hillary’s blue wall, a bulwark against Trump and the Republicans. All the poll-based statistical forecasts had Hillary Clinton winning, and some of them had her winning very easily. The Huffington Post forecast gave her a 98.2% chance of victory. The Princeton Election Consortium had her at 99%. I don’t know how much shoe-leather journalism could have overcome that, even if journalists were perceptive enough to say, “Aha, there’s something going on here. Trump might have a chance of victory.” The only person who really said that publicly and clearly was Michael Moore. He said it could happen, and he said as much again in this election cycle.

Yet even after the shock of 2016, journalists aren’t as disdainful of polling as they used to be.

That’s right. At one time, poll-bashing used to be really prominent among journalists, among well-known journalists. They tended to be suspicious of polls as a device to tell Americans what they’re thinking. They thought that was very presumptuous. Mike Royko, the famous columnist at the Chicago Tribune, and Jimmy Breslin, the famous columnist in New York City, were two prominent journalists who really were poll-bashers.

That vehemence has faded away quite a lot in the past 15 years, partly because of the rise of data journalists such as Nate Silver, who in 2008 developed a poll-based prediction model and anticipated correctly the outcome in 49 of the 50 states.

So what do the polls suggest to you in this election?

Yesterday [Sep. 9], Real Clear Politics presented six or eight new national polls. Those polls ranged from a 2% Biden lead over Trump to a 12% Biden lead over Trump. You can pick your poll, depending on which end of the political spectrum you find yourself on, and say, “Yeah, it’s a very close race” or “It’s a near landslide for Biden.” The polls are all over the place.

As a starting point, we can take a look at the Real Clear Politics national average of polls. But even then, that’s not telling us the whole story, or nearly so, because the national polls don’t always translate into a strong interpretation of what the Electoral College is going to do, and that’s the real key. In 2016, Trump lost the popular vote to Clinton, but he won the Electoral College by winning key states by narrow margins. That means you’ve got to go to the state polls. Sometimes the state polls are not as well-financed or as sophisticated, or don’t make the statistical adjustments that national polls do. And so, it becomes a grab bag in many respects.

You have to be very cautious. I think it’s good advice for news consumers to treat polls warily, to recognize that they do have a checkered past, that their track record is not perfect by any means. We should show a bit of skepticism, especially this far out.

So there’s an equilibrium where the public needs to be wary but not scornful.

Yeah, that’s a fair characterization. They should be mindful of the fact that polls can go wrong and have gone wrong in the past. They’re not perfect, and there are a lot of polls out there showing conflicting readings. They’re all pointing in the same direction, but some margins are narrow and some are wider.

It sounds like the issue is, to a degree, a lack of public literacy around polling.

Yeah, it is. It’s difficult to get beneath the surface with polling: the way the numbers are collected and interpreted, the way the numbers are weighted or adjusted statistically, what the margin of sampling error tells us — that gets to a level of complexity that I don’t think many news consumers are going to want to get into. It can be impenetrable for the average consumer. At the same time, it’s not a bad idea for people to understand that polls are not perfect; they’re not prophecies.
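As a concrete illustration of the margin of sampling error mentioned above, the standard formula for a simple random sample is z·√(p(1−p)/n). A minimal sketch follows; the 95% confidence level (z = 1.96), the 50/50 split, and the 1,000-person sample are illustrative assumptions, not figures from the interview:

```python
import math

def margin_of_error(p: float, n: int, z: float = 1.96) -> float:
    """Margin of sampling error for a proportion p observed
    in a simple random sample of size n, at the confidence
    level implied by z (1.96 corresponds to ~95%)."""
    return z * math.sqrt(p * (1 - p) / n)

# A hypothetical national poll of 1,000 respondents split 50/50:
moe = margin_of_error(0.5, 1000)
print(f"+/- {moe * 100:.1f} percentage points")  # +/- 3.1 percentage points
```

Note that this error band applies to each candidate’s share separately, so a 2-point lead in such a poll is well within the noise, which is part of why single polls can look contradictory.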

How can the polling process improve?

The whole polling industry, particularly election polling, is trying to find the next gold standard, if you will. There are still pollsters doing telephone interviews and surveys, but many are getting away from that because it’s so expensive to get a decent sample, and it is very frustrating to try to reach people that way. The industry has been battered by declining response rates: people don’t answer pollsters’ questions, or they don’t even pick up the phone when pollsters call because they think it’s going to be a spam call or some sort of unwanted robocall. Whether that has begun to interfere with and distort the results, the jury is still out on that question.

Nonetheless, pollsters are trying to find different ways to tap public opinion. They’re experimenting with internet panels, in which people are recruited to answer questions periodically from polling organizations. Some public opinion researchers are looking to social media platforms as a way to define and discern trends — whether Twitter, for example, signals changes in public opinion. They’re trying to figure out the next best approach to conducting their surveys, but no one has settled on the next gold standard.
