If you follow polls or politics, you’ve probably seen some very different news and numbers in Ontario lately.
Some people are quite surprised at this lack of poll consensus; I am less surprised, and wrote about this possibility in late 2017.
We’ve learned a lot about polling error over the last six months. After our polls in Calgary were way off the mark, we launched an exhaustive investigation that resulted in some very significant findings.
When we released our latest Ontario voter intentions this week, several people noted that our breakout for the 18-34 age group showed a significant lead for the PCs, while other polls, conducted with online panels, have a very different breakout for that age group. Much calm discussion and fair commentary ensued on social media. When another IVR poll (Ekos Research) with very similar 18-34 breakouts was released later the same day, I pointed out that, in my opinion, all the polls are correct.
Now, that’s not where the story ends. In recent days, we’ve seen several new online panel polls released, and much more discussion about the polls has continued. Some are even suggesting that IVR and other non-online-panel polls can no longer capture true voter intentions, and point to the 18-34 breakouts as clear evidence. It happened in Calgary, so could it happen again? The answer, of course, is that it’s unlikely, but more on why later.
But let’s park the talk of the breakouts for a moment and talk about the topline numbers. There is general consensus about where the Ontario Liberal Party stands: over the past three weeks, eight polls (four online panel, four IVR) have been released showing the Wynne Liberals between 26% and 29.3%. For the Ontario NDP it is a different story, from a low of 18.3% (IVR) to a high of 28% (online). Green Party support is consistent, at between 5% and 7%.
The largest deviation comes in the PC number, from a low of 36% to a high of 50%. It’s important to note that both the high and the low PC numbers are from IVR polls. The last Angus Reid Institute poll (online), from March 7th, also has the PCs at a high of 50% and the Liberals at 24%, as Mainstreet found on April 5th. But that is the lone online poll that shows the PCs above 43% in the last month; the two most recent online polls peg PC support at 40%.
Like many other public opinion researchers, we don’t rely exclusively on a single mode. IVR is our preferred mode for many reasons, but we regularly conduct live agent surveys, online polls and, more and more, blended modes. Our own online poll shows PC support at 42%, with the Liberals at 21% and the NDP at 20%. The 18-34 portion of that online sample has the PCs at 39%, the NDP at 27% and the Liberals at 20%. I will note, however, that the NDP leads the PCs by 3% among those 25-34, with the Liberals a sizable distance back in third place. As I have repeatedly mentioned, we’ve been running concurrent online samples with all our provincial and national polls since mid-2015 to measure mode effect.
There is consensus elsewhere as well, including on budget reaction, leader favourabilities and regional numbers. But people keep pointing to the differences, especially those age breakouts, so let’s get back to those.
In my opinion, and I fully recognize that opinions vary among pollsters, pundits and many other observers, this is much ado about nothing, especially at this point, with the campaign yet to unfold. Poll consensus may yet come by the end of the campaign, and so could breakout consensus. But I want to be clear: I in no way encourage herding, and I don’t support those who do.
To understand the relevance of the breakouts, let’s look at the last comparable election, which to me is the BC election of 2017. We all remember the rise of the Green Party and Andrew Weaver, the razor-thin result that left a majority government for Christy Clark out of reach by less than 100 votes in a single riding, and then the NDP/Green alliance that took control of the BC legislature. If you follow politics and elections closely, you might even remember that all the polls were right.
What very few will likely remember are the breakouts in those final polls, conducted by six firms using a variety of methodologies and sampling modes. For those who don’t remember, let’s take a closer look.
The final polls, released between May 3rd, 2017 and May 8th, 2017, had the following breakouts among those 18-34.
| Firm | 18-34 BC Lib | 18-34 BC NDP | 18-34 BC Grn | 18-34 Other |
| --- | --- | --- | --- | --- |
| Insights West (Net) | 32% | 46% | 20% | 1% |
| Justasson (Net/IVR Blend) | 24% | 38% | 32% | 6% |
Now, I am a big believer in comparing apples to apples, so let’s look at each set of like polls together. First, the pure online panel polls: Angus Reid has the NDP at 53% among those under 35 on the high end, while Insights West pegs them at 46%. For the BC Liberals, Insights West has them at 32%, while Angus Reid has the low at 26%. There is consensus on the Greens, but Insights West puts Other at just 1% while ARI put it at 4%.
Among the pure IVR polls, we see even wider deviation, with Forum showing the near reverse of Mainstreet: we put the BC Liberals at 43% and the NDP at just 35%, while Forum has the NDP at 49% and the BC Liberals at just 27%.
Among the blended sampling polls, Ipsos pegs Liberal support at 33% while Justasson puts it at 24%; Justasson has the NDP at 38% while Ipsos has them at 42%. Here we see the largest deviation for the Green Party, with Justasson showing the Greens at 32% while Ipsos puts them at just 19%. (Note these were not identical modes: Ipsos blended live agents with an online panel, while Justasson blended IVR with an online panel.)
Surely, in this change election, with this massive deviation among millennial voters, some polls should have had catastrophic error. But that wasn’t the case in BC: almost all the polls proved correct within their respective margins of error or confidence intervals. The question is how?
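Part of the answer is simple arithmetic: the margin of error on a subgroup is much wider than on the topline, because the subgroup sample is smaller. As a rough sketch (the function name and sample sizes below are my own illustration, not any pollster's published methodology), the standard formula for a proportion shows how a breakout of a few hundred respondents carries roughly double the uncertainty of a 1,000-person topline:

```python
import math

def margin_of_error(p, n, z=1.96):
    """95% margin of error for a proportion p from a simple random sample of size n."""
    return z * math.sqrt(p * (1 - p) / n)

# Illustrative numbers only: 40% support measured on a full 1,000-person
# sample versus a ~250-person 18-34 breakout of that same sample.
moe_topline = margin_of_error(0.40, 1000)   # roughly +/- 3 points
moe_breakout = margin_of_error(0.40, 250)   # roughly +/- 6 points
```

Two breakouts that sit 8 or 10 points apart can therefore both be "correct" in the statistical sense, even when the toplines agree.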
The answer lies in the deviations in the other breakouts, in mode effect, and likely in response bias as well. Mainstreet showed the NDP leading in the 35-49 age cohort as well as the 50-64 age group, while Forum and others had the BC Liberals leading in those demographics. (There are, unfortunately, no direct apples-to-apples comparisons due to variations in the way different pollsters report the demographics.)
Which brings me full circle back to Calgary, and why I suggest response bias is likely a factor as well. I know this will shock some of you, but some people lie to pollsters. In the late days of the Calgary mayoral election, we tried to sample people who were just under 35 to test against our final master sample. We did this with a screening question, asking people their age at the beginning of the survey, before asking voter intention. That raw data set was published along with all our other data as part of our Calgary review; you can find it here. The age question was also left in the script at the end of the survey. What it revealed was that quite a few of those who told us they were under 35 at the beginning of the survey revealed they were not at the end of the survey. Or perhaps they lied at the end and told the truth at the beginning. This response bias varies by mode and by a number of other factors.
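The check itself is straightforward to reproduce on any raw file that repeats a screening question. The sketch below is hypothetical: the record layout and field names are mine, not the published Calgary data's, but the logic is just comparing the two answers per respondent:

```python
# Hypothetical sketch: flag respondents whose age answer at the start of the
# survey disagrees with the same question repeated at the end. The records
# below are made-up examples, not rows from the actual Calgary data set.
responses = [
    {"id": 1, "age_start": "18-34", "age_end": "18-34"},
    {"id": 2, "age_start": "18-34", "age_end": "50-64"},  # inconsistent answers
    {"id": 3, "age_start": "18-34", "age_end": "18-34"},
]

inconsistent = [r for r in responses if r["age_start"] != r["age_end"]]
rate = len(inconsistent) / len(responses)
print(f"{len(inconsistent)} of {len(responses)} respondents changed their answer ({rate:.0%})")
```

The mismatch rate, broken out by mode, is one way to put a number on how much of the 18-34 divergence could be screening-question response bias rather than sampling.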
A more important number to look at, in my opinion, is the maximum weight in any sample set: the ratio between the unweighted frequency and the weighted frequency. Our own latest Ontario sample, like the one for Quebec, has a maximum weight just over 2, and our IVR samples, which all include an RDD component, have lower maximum weights than what we have seen over the past two years from non-RDD samples.
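To make that ratio concrete, here is a minimal sketch of one common way to compute cell weights: each cell's weight is the factor that scales its share of the raw sample up or down to its target population share. The counts and population shares below are invented for illustration and are not Mainstreet's actual frame:

```python
# Illustrative "maximum weight" check. A cell's weight is its target
# (population) share divided by its share of the unweighted sample;
# both sets of numbers below are made up for the example.
sample_counts = {"18-34": 150, "35-49": 250, "50-64": 300, "65+": 300}  # raw completes
population_share = {"18-34": 0.27, "35-49": 0.25, "50-64": 0.26, "65+": 0.22}

n = sum(sample_counts.values())
weights = {g: population_share[g] / (sample_counts[g] / n) for g in sample_counts}
max_weight = max(weights.values())  # here the under-sampled 18-34 cell drives it
```

The intuition is that a large maximum weight means a small, possibly unrepresentative group of respondents is being stretched to speak for a much larger slice of the electorate, which is where breakout error tends to hide.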
Like the panel pollsters who suggest that non-contact and non-response bias among those under 35 might affect the accuracy of phone polls, I will suggest that fewer people over 65, and fewer low-information voters, take part in online panels. Unlike the exclusive panel pollsters, I won’t suggest that any polls are incorrect or inaccurate; it’s why they are called snapshots. We are talking to different sub-sets of voters, in different ways, on different days. The only true test will come on election day. It’s likely that what online panel polls miss among senior voters is made up for by their under-35 sample, and vice versa for IVR. My only caution is that there is a real danger in ignoring the voter intentions of lower-information voters, as I wrote about in 2016.
Mainstreet has made many changes since BC and Calgary: staff changes, frame design, script design, methodology and reporting. We’ve addressed every shortcoming that led to the error in Calgary, and re-introduced RDD and multilingual surveys that we did not use in Calgary. Our panel of expert advisors never once suggested that the mode (IVR/panel/live agent/blended) may have been a contributing factor, and I certainly agree. We know exactly why we overestimated one candidate’s support and underestimated another’s so significantly.
The only remaining question from Calgary is why the two online panel polls that were in the field (including one using the Asking Canadians panel relied on by numerous pollsters) missed so significantly in the reverse direction of our poll, overestimating the candidate we underestimated, and vice versa. To be clear, this was not a minor miss: the candidate we had underestimated by 12% was overestimated by 8% in the panel polls, and the candidate we overestimated by 8% was underestimated by 7% in the panels. The task of figuring out exactly what led to this is currently in the hands of a very smart panel of experts: as we speak, the MRIA review panel is looking at this mystery. I have my suspicions, but look forward to reading their report.
In my opinion, the current dispute between pollsters, pundits and observers about the age breakouts is much ado about nothing. However, I think it’s something that bears further scrutiny, and I am prepared to be proven wrong… on June 7th. This push to discredit polls, modes or specific pollsters ahead of election outcomes is something I disagree with strongly, for reasons that will become clear after the Calgary election panel reports its findings.
My staff and I will be posting a Mainstreet polling FAQ in the days ahead to address some of the questions raised this week, and we hope you will find the information valuable when looking at our work.