Is public opinion too volatile for accurate polling?

Ryan Bridge 10/06/2019

Pollster David Farrar joins Ryan Bridge on Magic Talk Drive to discuss the wide divergence between the two recently released polls.

The Newshub-Reid Research poll and the 1 NEWS Colmar Brunton poll have produced contradictory results, with Newshub showing a rise in support for Labour and 1 NEWS showing the opposite, a shift in favour of National.

Ryan first asked David what could explain such a seemingly wide divergence.

It’s more than seemingly, this is a huge difference: 51 percent support for Labour in one poll against 42 percent in the other. A nine-point gap is huge and way more than the margin of error.

The simple answer is that at least one is wrong.
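For context, here is a rough sketch of the margin-of-error arithmetic behind that claim, assuming each poll is a simple random sample of about 1,000 respondents (the standard textbook model; neither firm's actual weighting scheme is described in the interview):

```python
import math

def margin_of_error(p: float, n: int, z: float = 1.96) -> float:
    """95 percent margin of error for a simple random sample proportion."""
    return z * math.sqrt(p * (1 - p) / n)

# Support near 50 percent with ~1,000 respondents is the worst case.
print(f"+/-{margin_of_error(0.5, 1000):.1%}")  # about +/-3.1%

# Even allowing for sampling error in both polls at once, the combined
# uncertainty is well short of the nine-point gap between 51 and 42.
combined = math.hypot(margin_of_error(0.51, 1000),
                      margin_of_error(0.42, 1000))
print(f"+/-{combined:.1%}")  # about +/-4.4%
```

On those assumptions the two results sit roughly twice the combined margin of error apart, which is why Farrar concludes at least one poll must be wrong.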

Ryan points out that it could be that both are wrong, but asks if there is any way to know.

You can’t really tell anything from just one poll, so you need to look at the long-term trend. Unfortunately, even the trends of these polls are going in different directions. One poll has Labour up five percent while the other has them down by six percent.

One of them simply must be wrong.

The other factor might be that they started at slightly different times. Newshub's poll began just before the Budget, which is an unusual time to start a poll, during a major event rather than after it.

But even if there was a negative reaction to the Budget, it’s not likely to be nine points. That represents 300,000 voters changing their minds in one week.
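The back-of-the-envelope arithmetic behind that figure, assuming an electorate of roughly 3.3 million (close to the 2017 enrolment figure; the interview does not state the base Farrar used):

```python
electorate = 3_300_000  # assumed: roughly the 2017 enrolled population
swing = 0.09            # the nine-point gap between the two polls
print(f"{electorate * swing:,.0f}")  # 297,000 -- roughly 300,000 voters
```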

Ryan then asks if there might be an issue with the samples these polls interviewed.

David explains that the "challenge is to make sure you are calling the right people, what you want are people who are going to vote. It's easy to get a hold of a thousand adults but you want it to be a thousand adults that reflects the population of those that vote."

To be fair, both these polls were very accurate last election; they were both within around one percent, and very close to each other.

In the US they always have so many polls it’s silly, but certain firms always lean Democrat and certain firms always lean Republican.

Here, though, the polls have generally been quite accurate.

It is rare for the two major polling firms to disagree to this extent. It’s happened about three times in 20 years that I can remember.

Ryan asks David what that might mean: whether one or both polls are wrong, or whether there is simply a lot of swing in public opinion.

I will say I have observed in my own work that the electorate is a bit more volatile than it used to be. During the nine years of the National Government the polls were remarkably stable for National; they never really varied much.

But since the election we have seen that support is more volatile.

That doesn't explain these particular results, but I think it's fair to say things are a little more volatile. Polling is also getting harder. It used to be you called a thousand people at home on their landlines; now fewer people have landlines.

So the Colmar Brunton poll calls half of its sample on cellphones, while the Newshub poll reaches a quarter of its sample through an internet panel, and that might account for some of the difference too.
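A toy illustration of why that mode mix can matter: if respondents reached by landline, cellphone and online panel lean differently, each firm's blend yields a different headline number. Only the mode shares come from the interview; the per-mode support figures, and the assumption that the remainder of each sample is landline, are invented purely to show the mechanics:

```python
# Hypothetical Labour support by contact mode (invented for illustration).
support = {"landline": 0.44, "cellphone": 0.52, "online_panel": 0.48}

def blended(mix: dict[str, float]) -> float:
    """Headline figure as the weighted average across contact modes."""
    return sum(share * support[mode] for mode, share in mix.items())

# Mode shares from the interview; remainder assumed to be landline.
colmar_brunton = {"landline": 0.50, "cellphone": 0.50}  # half on cellphone
newshub = {"landline": 0.75, "online_panel": 0.25}      # quarter online

print(f"{blended(colmar_brunton):.0%}")  # 48% -- one methodology
print(f"{blended(newshub):.0%}")         # 45% -- the other
```

A three-point gap appears from the mode mix alone, even though every individual answer is honest; real pollsters weight their samples precisely to correct for this.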

New Zealand, more than almost any other country, has a history of polling coming very close to the election result. The last time the polls were really far out was in 1993, which was a first-past-the-post election.

The difference that makes is that under first-past-the-post a one or two percent swing might mean ten or twelve seats, whereas in our current system one or two percent only means one or two seats.

That tends to mean that those variations, if they are small, don’t matter so much.
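A rough sketch of the seat arithmetic behind that point, using a deliberately simplified proportional model (it ignores the Sainte-Laguë formula, the party-vote threshold, electorate seats and overhangs that the real MMP system involves):

```python
HOUSE = 120  # seats in the New Zealand House of Representatives

def proportional_seats(vote_share: float) -> float:
    """Crude MMP approximation: seats in direct proportion to votes."""
    return vote_share * HOUSE

for share in (0.42, 0.44):
    print(f"{share:.0%} -> ~{proportional_seats(share):.0f} seats")
# 42% -> ~50 seats, 44% -> ~53 seats: a two-point swing moves 2-3 seats.
# Under first-past-the-post the same swing flips every seat held by a
# margin under two points, which can easily be ten or twelve seats.
```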

But a nine-point difference between the two polls is bad; you just have to hope that next time the polls come out they are closer.

What will be interesting is that Newshub has a cannabis poll out tonight, while 1 NEWS released theirs at midday.

It would be very interesting if they get different results on the cannabis issue as well as on party support.

When Ryan suggests that he knows they do diverge, David gets excited and explains, “Ah well, that suggests that this is a real sampling problem, that fundamentally they did ask quite different groups of people.”
