M37 Pollster Calls "Push Poll" Claims Bogus
I don't know if you've noticed or not, but there seems to be a bit of discussion going on over this whole Measure 37 thing. Have you heard of it? Creating some kinda buzz, I'll tell ya! Anyhoo, back in October of last year, before Washington and Arizona voted on clones of the land use measure Oregonians passed in 2004, the Defenders of Wildlife and the Izaak Walton League commissioned a poll from Greenberg Quinlan Rosner, a Democratic-leaning but highly respected firm (CEO Stan Greenberg is the 'Democratic' half of NPR's bipartisan polling team), in order to test Oregonians' satisfaction with Measure 37 since its passage in 2004.
Now, let's not be naive here: both the groups paying for the study and the people conducting it are friendly to progressive causes. And to the extent that progressives aren't usually hardcore property rights libertarians, the expectation is that DoW/IWL were hoping for results that would show buyer's remorse on M37. On the other hand, we're not talking about "internal" polls paid for by a candidate, where the idea is to release just the horse race number, if it's good--sometimes not even the totals, just the numerical gap between candidates. This is a publicly released poll, put out with the polling company's reputation on the line. So while a healthy measure of skepticism is entirely warranted in evaluating the results of a commissioned poll like this, skepticism must concede to information if that information is forthcoming.
The problem with Oregonians in Action's Dave Hunnicutt is that he has the skepticism; he just isn't interested in having it concede to information. He haughtily dismissed the poll when it was released, telling the Register-Guard:
"[The GQR release is a] 'push poll' that gives 10 or 15 negatives about the results of the measure and then pushes people to respond in a certain way. I could probably do my own poll (with my own examples) and get the same result in the opposite direction."

So he's skeptical. But is he right? To get that information, you have to ask. Dave could have asked me what a "push poll" really is, and I'd have been happy to tell him. Start with the fact that actual push polls aren't done with the intent of gathering data, nor are those data released. They're used solely to put outlandishly negative thoughts into the minds of voters, specifically targeted ones (i.e., not a random sample). This is clearly not a push poll: there was plainly a legitimate attempt to survey a representative sample of Oregonians and release the data to the public.
What Dave is really trying to accuse the surveyors of, I think, is something conveniently defined right under 'push poll' in that link I gave you: Question Order Effect or Question Order Bias. That's what happens when you ask questions in a sequence such that earlier questions potentially color later answers. This is a valid avenue to investigate, and it's one I broached with Ben Tulchin, Associate Director at GQR and pollster for the Measure 37 survey. (For reference, here's the press release {pdf} for the poll.)
The first and most important question related to order bias is where the 'big' question is located. In this case we're talking about the ballot test: a question and response set that mimic an actual initiative or candidate race. So here, we're talking about whether the respondents would pass or reject Measure 37 if they could vote again. Hunnicutt's allegation is that the ballot test result is not credible, because somehow the pollsters "pushed" the response on the ballot test by influencing it with previous questions. In Hunnicutt's mind, as in this rather sloppy takedown of the poll findings, the use of some worst-case nightmare M37 scenarios has unfairly tainted the question of whether M37 would pass today.
But there's only one logical way for that to happen, and that's if the ballot test question is asked after the "nightmare" series of questions. If it comes beforehand, how can it be influenced by them? It can't. So I asked Ben where the ballot test question came in the sequence. The right answer is "as soon as possible after the screener questions."
Screeners are questions used to make sure you have a valid, randomly selected respondent who is eligible for further, substantive questions. For instance, when creating a poll of "likely voters," you screen registered voters (itself a screening question) by asking them how likely they think they are to vote, and then choose the most likely respondents for your LV sample. Ben assured me that the ballot test was the first question after establishing whether the respondent had heard about M37, and how much they had heard. No information about special cases or consequences came up before the ballot test, precisely to preserve the ballot test question as legitimately neutral.
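If it helps to see what that ordering looks like on paper, here's a minimal sketch of a questionnaire laid out the way Tulchin describes it. The question wording, the labels, and the little ordering check are my own illustration, not GQR's actual instrument, which wasn't released in full.

```python
# A hypothetical question order modeled on Tulchin's description; everything
# below is illustrative only -- not GQR's actual survey instrument.
QUESTIONNAIRE = [
    # Screeners: confirm an eligible, randomly selected respondent
    ("screen_registered", "Are you registered to vote in Oregon?"),
    ("screen_voted_2004", "Did you vote in the November 2004 general election?"),
    # Awareness: how much has the respondent seen, read, or heard about M37?
    ("awareness", "How much have you heard about Measure 37?"),
    # Ballot test: asked BEFORE any scenario questions, so nothing that
    # follows can color this answer
    ("ballot_test", "If the election were held today, would you vote yes or no on Measure 37?"),
    # Only after the ballot test: the specific scenarios and follow-ups
    ("scenario_1", "[worst-case development scenario #1]"),
    ("scenario_2", "[worst-case development scenario #2]"),
]

def ballot_test_precedes_scenarios(questionnaire):
    """The order-bias safeguard: the ballot test must come before every scenario item."""
    ids = [qid for qid, _ in questionnaire]
    ballot = ids.index("ballot_test")
    first_scenario = min(i for i, qid in enumerate(ids) if qid.startswith("scenario"))
    return ballot < first_scenario

assert ballot_test_precedes_scenarios(QUESTIONNAIRE)
```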
So there goes that theory; sorry Dave. The Reason columnist offered one more half-baked charge:
"Lastly, the fact that the poll report fails to include a detailed methodology--such as the actual survey and a description of the sampling methods--is a strong indication of its inherent bias. National survey firms routinely include this information to make the survey results and methods transparent to readers. How are we to tell if the small sample of 405 Oregon voters polled were genuinely selected at random and are representative of the population? Readers have no way to evaluate quality of the data and whether the sample accounts for geography, income, gender, political affiliation, etc. of respondents. If the pollsters have nothing to hide, then why not release their methods?"

Technically, the author has a point here--the very best survey releases are the ones that include the full survey instrument, detailed methodology, and crosstab/weighting information. And for research-based polling, those are absolutely required. But private polls don't typically release full transcripts of their surveys in press releases, and that does not by any means render them "inherently biased." The description of the methodology reads:
"From October 12-16, 2006, Greenberg Quinlan Rosner Research conducted a statewide survey by telephone among 405 registered voters in Oregon who voted in the 2004 November general election that included Measure 37 on the ballot. The survey's margin of error is plus or minus 4.9 percent."

Nobody with the credibility GQR has would publish a margin of error calculation without working from a random sample of the universe--Oregonians who voted in November 2004. The survey was statewide, so that answers the geography claim. Question texts are included, although not all of them. The one omission I agree with is the failure to print the exact text of the ballot test question: it's the most important question, and how it was worded is part of the equation. Overall, however, the quality control standards exercised in the survey are rigorous enough to assuage fears of a nonrepresentative sample or question bias.
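For what it's worth, the stated margin of error is exactly what the textbook formula gives for a sample that size. Here's the back-of-the-envelope check, assuming a simple random sample, a 95% confidence level, and the worst-case 50/50 split (it ignores any design effects from weighting, which the release doesn't detail):

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """Worst-case 95% margin of error for a simple random sample of size n."""
    return z * math.sqrt(p * (1 - p) / n)

# GQR's sample: 405 voters who cast ballots in November 2004
print(f"{margin_of_error(405):.1%}")  # 4.9% -- matches the stated plus or minus 4.9 percent
```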
But that's just one poll, I hear skeptics say, and it's still done by a Democratic-leaning outfit, no matter how professional. That's why it's handy to have another poll on the subject, just released by 1000 Friends of Oregon, the primary anti-37 advocacy group.
To be honest, I don't think 1000 Friends has been anywhere near as visible as they need to be in this session; OIA people are dominating the hearings in the Land Use Committee. But they did take into account the likelihood that OIA and their supporters would seize any chance they could to shout "Push poll! Push poll!" and parried it in advance by using a Republican-leaning pollster, Moore Information. You may remember them as the ones who tried to tell us Saxton and Kulongoski were neck and neck {pdf} at 38% in September. So no one's going to accuse them of slanting left.
And they got roughly the same results GQR found {pdf} three months ago: 61% would favor changing M37 (23% would like to see it repealed entirely), while 31% want it left alone. They don't have a straight ballot test question; they used an "informed ballot" that came after the "what would you like to see happen" question, yielding a 52% No vote on the measure. Moore's press release provides several crosstabs, and states explicitly that it conducted a statewide representative sample of 500 voters, for a 4% MoE. So Moore comes at the issue from a slightly different set of questions but largely replicates the GQR survey, and definitely reinforces it with much fresher data.
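One quick way to see how robust that change-versus-leave-alone split is against sampling noise: take Moore's 500-voter sample, work out the standard 95% margin of error (about 4.4%, which the release rounds to 4%), and push both numbers against each other as hard as the margin allows. The lead for changing the measure is still enormous. The arithmetic below is my own back-of-the-envelope check, not anything from either release.

```python
import math

# Moore Information: 61% favor changing M37, 31% want it left alone, n = 500
n, z = 500, 1.96
change, leave = 0.61, 0.31

moe = z * math.sqrt(0.5 * 0.5 / n)   # worst-case 95% margin of error, ~4.4%

# Shrink "change" and inflate "leave alone" by the full margin of error;
# the preference for changing the measure still leads by a wide gap.
print(f"MoE: {moe:.1%}")                                         # 4.4%
print(f"Worst-case gap: {(change - moe) - (leave + moe):.1%}")   # 21.2%
```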
Taken together, it's very difficult for OIA and their allies to refute the claim that Oregonians feel they got more than they bargained for with M37. The Legislature is empowered, and has the responsibility, to act on that sense of the electorate, and to look at ways to reduce the confusion and bring the process back into balance. There is a whole range of things that can be done or not done, and in the near future we'll try to address some of the ideas the Fair Land Use Committee might have to chew on, if they can put the freeze on M37 for the duration of the session by passing Senate Bill 505. But don't let anyone tell you that Oregonians aren't amenable to change, or that the polls indicating that amenability are anything like push polls. They're trying to distract you.