As we head into 2020, pollsters, along with the analysts and political junkies who consume their polls, hope never to repeat the national dismay that followed the unexpected outcome of 2016. Pollsters have identified many mistakes from that year that they hope to correct in 2020. But the takeaway among some more casual observers, that polls simply can't be trusted, is neither accurate nor a useful response to a year that posed some unusual challenges. As many Daily Kos readers know, the national polling was actually pretty darn accurate, especially relative to polling from previous years. Check out the following graph from Pew Research Center.
The real polling problem came at the state level, and the good news is that a report analyzing 2016's missteps from the American Association for Public Opinion Research (AAPOR) found that some of those problems can be corrected moving forward.
The education gap
One of the biggest polling errors arose because pollsters failed to take into account the fact that college graduates were more likely to respond to surveys than those with less formal education. This is a bias that pollsters can adjust for, and the failure to do so particularly skewed many critical state-level polls in 2016. As Pew Research Center writes:
This mattered more than in previous years, when there weren’t big partisan differences between the two groups. In 2016, however, college grads broke for Clinton while high school grads broke for Trump. State polls that didn’t adjust – or weight – their data by education were left with a biased sample.
A University of New Hampshire poll, for instance, gave Clinton a 16-point advantage over Trump just before the election. Clinton ultimately won the state, but by a razor-thin margin. "That poll’s gap would have closed entirely if its analysts had weighted for education, according to the AAPOR report," writes Pew.
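The weighting adjustment Pew and AAPOR describe can be illustrated with a minimal sketch. The numbers below are made up for illustration and are not real 2016 data; the idea is simply that if college graduates are overrepresented in the raw sample, each education group's answers get rescaled to that group's actual share of the electorate before computing the topline:

```python
# A minimal sketch of post-stratification weighting by education.
# All figures are hypothetical, chosen only to illustrate the mechanism.

# Hypothetical raw sample: each education group's share of respondents
# and its candidate support.
sample = {
    "college_grad": {"share": 0.60, "clinton": 0.55, "trump": 0.45},
    "non_college":  {"share": 0.40, "clinton": 0.40, "trump": 0.60},
}

# Hypothetical population benchmarks (e.g., from Census data):
# the actual education mix of the electorate.
population_share = {"college_grad": 0.35, "non_college": 0.65}

def weighted_topline(sample, population_share, candidate):
    """Rescale each group's support to its population share
    and sum, yielding an education-weighted topline."""
    return sum(
        population_share[group] * stats[candidate]
        for group, stats in sample.items()
    )

# Unweighted topline uses the (skewed) sample shares as-is.
unweighted = sum(s["share"] * s["clinton"] for s in sample.values())
weighted = weighted_topline(sample, population_share, "clinton")

print(f"Unweighted Clinton support: {unweighted:.1%}")  # overstated
print(f"Weighted Clinton support:   {weighted:.1%}")    # corrected
```

In this toy example, a sample that is 60% college graduates overstates Clinton's support; reweighting to a 35% college-graduate electorate pulls her topline down, which is the direction of the correction the AAPOR report found that unweighted state polls missed.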
The late breakers
Voters who settled on a candidate in the final days of the election also posed particular problems in some states. Many states suffer from a lack of high-quality statewide polls anyway, but even states with decent polling missed the last-minute swing toward Trump because few organizations, if any, had polls in the field in the final days of the election. The New York Times writes:
In Michigan, Pennsylvania and Wisconsin, between 13 and 15 percent of respondents in exit polls said they had decided in the last week of the campaign. Those voters broke for Mr. Trump by a wide margin; in Wisconsin, it was about 30 points.
Little can be done to account for such a wild swing in the final days of an election, but perhaps the FBI will refrain from making any major announcements in the final week of election cycles moving forward.
“Shy” Trumpers
While voters with less formal education were harder to reach, some voters with more education didn't actually want to tell pollsters they were voting for Trump. Writes the Times:
A pre-election study by Morning Consult warned that wealthier, more educated Republicans appeared slightly more reluctant to tell phone interviewers that they supported Mr. Trump, compared with similar voters who responded to online polls.
This problem is harder to correct than a simple weighting adjustment, but pollsters may be able to tease out these voters by employing new techniques in questioning them.
One polling firm that showed Mr. Trump narrowly leading in some of the most inaccurately polled states — Michigan, Pennsylvania and Florida, all of which he won — was Trafalgar Group, a Republican polling and consulting firm that uses a variety of nontraditional polling methodologies.
It sought to combat the shy Trump effect by asking respondents not only how they planned to vote but also how they thought their neighbors would vote — possibly offering Trump supporters a way to project their feelings onto someone else.
An attitude adjustment
One of the final lessons for consumers of polls may be to put less emphasis on viewing them as a crystal ball. “I’d suggest that predicting election outcomes is the least important contribution of pre-election polls," ABC News pollster Gary Langer said. "Bringing us to a better understanding of how and why the nation comes to these choices is the higher value that good-quality polls provide.”
It's a fair point. Even properly weighted high-quality polls are nothing more than a snapshot in time, offering us insights into how voters are thinking about certain issues or candidates at any given moment. That’s why, for instance, even good polls missed the late break toward Trump in the final weeks and even days of the campaign.
In the end, we can only tell so much about the outcome of a race through polling. What polls are generally good at is telling us what is motivating voters, what issues they are prioritizing, and how those factors are affecting a race. So while across-the-board cynicism about polling robs us of some of the important intelligence we can glean from surveys, a little well-placed caution is always wise.