Polling performance in Minnesota in 2012
With the elections concluded and all the ballots counted and certified it’s time to start analyzing the stuff that happened. Some of the stuff that happened was polling. So polling is one of the things that shall be analyzed. Starting in this very post.
But… before I proceed to the super insightful analysis that will come, I need to talk definitions. Specifically, I need to explain what I mean when I use the words Error and Bias. I’ll illustrate the difference with the following example.
A pollster releases two polls; one of them is five points too favorable to one candidate and the other poll is five points too favorable to the other candidate. The average error is five points, but the average bias is zero points.
A different pollster releases two polls, both of them two points too favorable to one candidate. In this case the average error is two points and the average bias is two points. The first pollster was less accurate and less biased, while the second pollster was more accurate and more biased.
Bias is a measure of what direction the errors are in, while Error is a measure of absolute error. Bias, in this context, does not mean a premeditated attempt to skew the numbers, it’s simply a measure of that skew.
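The distinction can be sketched in a few lines of code, using the two hypothetical pollsters from the example above. Errors here are signed: positive means too favorable to one candidate, negative means too favorable to the other.

```python
def average_error(errors):
    # Error: mean of the absolute miss, direction ignored.
    return sum(abs(e) for e in errors) / len(errors)

def average_bias(errors):
    # Bias: mean of the signed miss, so opposite misses cancel out.
    return sum(errors) / len(errors)

pollster_one = [5, -5]  # one poll 5 points toward each candidate
pollster_two = [2, 2]   # both polls 2 points toward the same candidate

print(average_error(pollster_one), average_bias(pollster_one))  # 5.0 0.0
print(average_error(pollster_two), average_bias(pollster_two))  # 2.0 2.0
```

As the output shows, the first pollster's misses cancel out in the Bias figure but not in the Error figure, which is exactly the less accurate/less biased split described above.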
With that, the following two tables show the average Error and average Bias of all the publicly released polls for the four statewide races in Minnesota. There are four columns: the "race" in question, the "total" of all the polls, just the polls in the "last 4" weeks of the election, and just the polls in the "last 2" weeks of the election (I used those two time frames because they coincided well with when polls were released).
For the Presidential race you can see that the error and bias are both small even when including all of the polls. In fact, bias was almost non-existent in Presidential polling in Minnesota. The polling of the other races didn’t fare as well and benefited from proximity to the election to a much greater degree.
The polling on Photo Voter ID was obviously way off, but I don't think that was something the pollsters got wrong. Instead, all indications are that there was a rapid collapse of support for the amendment in the last week of the election, which the last two polls released, one by Public Policy Polling and one by SurveyUSA, confirmed.
The Error and Bias on the Photo Voter ID amendment match exactly. This means that every single poll that came out on the issue overstated the amount of support that the amendment wound up getting. This confirms the hypothesis offered above.
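The reasoning in that last step can be checked directly: the average of the absolute errors equals the absolute value of the average signed error exactly when no errors cancel, i.e. when every poll missed in the same direction. A minimal sketch, with made-up error numbers purely for illustration:

```python
def same_direction(errors):
    # Average |error| equals |average error| exactly when every
    # signed error points the same way (no cancellation occurs).
    avg_error = sum(abs(e) for e in errors) / len(errors)
    avg_bias = sum(errors) / len(errors)
    return avg_error == abs(avg_bias)

print(same_direction([3, 4, 6]))   # True: every poll overstates support
print(same_direction([3, -4, 6]))  # False: one poll missed the other way
```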
The Error and Bias in the Senate race were high as well. This is likely due to two separate, but somewhat related, factors. One is that while this race was stable for a long while, Amy Klobuchar started pulling away at the end, something that may not have been fully captured in the polling.
And two, blowout elections like this one tend to have a greater variance in outcomes, relative to the polling, than a close election. And this tends to benefit the candidate doing the blowouting. In fact the same thing happened in the 2006 Senate race.
The other thing that supports the idea of late movement in these races is that the polling on the Presidential question was spot on. If the polling itself were systematically biased you would expect to see that show up in the Presidential numbers as well, but it didn't.
What you don’t see in any of the results is the thing that has become gospel in certain circles of the state, that polling in Minnesota is always biased in favor of the Democrats. In 2012, when the polling was biased, it was generally in the favor of Republicans and their amendments.
The polling firm leading the charge in this regard was none other than Mason-Dixon, the StarTribune’s new polling partner. But I’m getting ahead of myself, I’ll get into individual pollster performance in the next post in this series.