Wednesday, November 11, 2020

That Poll-Asked Look, Con't

NY Times Upshot pollster Nate Cohn talks to The New Yorker's Isaac Chotiner about what the polls got wrong this year compared to 2016, and it's a long, long list, with a fair number of "I don't knows" in it.


Isaac Chotiner: In a lot of preelection polling, Trump was running ahead of Republican Senate candidates. But Republicans seemed to do better than Trump in the generic House vote, and in some of these Senate races. Why do you think that was?

Nate Cohn: I don’t know why that turned out to be the case. I think one of the most interesting parts of the polling error this year is that it was greater down-ballot than it was at the top of the ticket, and in 2016 it was the opposite. And so while a lot of our explanation for what went wrong in 2016 was searching for things that were mainly about the President, I’m not sure that the pollsters would be right to suppose that this year’s polling error is unique or specific to the President.

IC: In 2016, the national polls were off a couple points, and then Midwestern polls, especially polls that didn’t weight by education, were off more. This year, with your polling for the New York Times and with other polling, there were a lot of misses. And these were the polls that really took weighting by education seriously, took a lot of steps after 2016 to fix errors, and in your case nailed the 2018 midterms. Do you know what happened this time?

NC: I don’t. I can offer you some theories.

IC: Yeah, please.

NC: But, before I say that, I do want to agree that this was a much bigger polling miss, in important ways, than in 2016. It was a bigger polling miss in the national surveys. It was a bigger polling miss for the industry’s most prominent and pricey survey houses. The state polling error will be just as bad, even though, as you mentioned, many state pollsters took steps to increase the number of white voters without a degree in their surveys. And state polls look a lot like they did in 2016.

But, if the state polls are just as bad as they were in 2016, despite steps that we know improved the President’s standing in the surveys, we can say with total confidence—and I know this was true in our data—that the underlying survey data has to be worse than it was in 2016. Or, if you prefer, if all the pollsters were using the 2016 methodology, the polls would have been far worse this year than they were in 2016. And that is really interesting. As I said, I can list a bunch of theories for you.

IC: Yeah. What are your theories?

NC: Well, the key framing is what’s changed since 2016. What would make the polls worse now than they were then? So one possibility is that it’s four more years of Trump, and that as American politics grew more and more defined along the lines of your attitudes about the President, and as old political allegiances sort of fell away, non-response bias in polling became more and more correlated with Presidential vote choice.

Another possibility is that “the resistance” is what broke the polls. Think of all of the political engagement on the left, the millions of dollars that were spent to help Jon Ossoff in 2017 or to help Jaime Harrison in 2020. That represents a tremendous increase in the level of political participation on the part of progressives. We know that politically engaged voters are more likely to respond to surveys. And so it may be that as the Trump Presidency has totally energized the Democratic base, it has also led those same kinds of voters to increase their propensity to respond to political surveys.

Another possibility is the high turnout this year. We in the polling world have tended to assume that higher turnout makes polling easier, because we think of turnout as an additional variable that the polls have to get right beyond just taking a nice sample of the population, and the higher the turnout, the less that extra variable matters. This year, though, we have this huge increase in turnout, and most people have supposed that it was good for Joe Biden. Maybe it was good for Joe Biden. But I think we also have to be open to the idea that it was not good for Joe Biden. In Florida, where we were collecting turnout data live on Election Day, I can tell you with certainty that the electorate was more Republican than it was in 2016, more than our polls projected, no matter your likely voter methodology. That may be true elsewhere in the country. I don’t know. It may not be. We just don’t have that data yet.

But I would note that many of the late polls stopped showing a gap between the preferences of registered and likely voters, and in some cases it went into reverse, where the Democrats were faring better among likely voters than among registered voters. A few late examples—the CNN poll showed Biden up ten in Pennsylvania among likely voters, but only up five among registered voters. I believe the final ABC News/Washington Post poll in Pennsylvania also showed Joe Biden doing better among likely voters than among registered voters.

And then a final thing I would raise is that maybe it was the coronavirus. You may recall that, one year ago at this time, we published a series of polls that showed Biden narrowly ahead of Trump and Elizabeth Warren trailing Trump, and those polls were a lot more accurate than the polls we have done since then.
 
There are a lot of factors, but the big one was education. Education, not race, is now the single biggest predictor of partisan voting, but race is still a close second.

And Democrats did badly with Latino men and White college-educated women in particular. The polls overestimated Biden's support among these groups by double digits at a minimum.

But the major problem is this: you can't fix voter suppression with turnout. You just get turnout that favors the suppressors.
