
Now You Know, Not All Polls Are Equal

By Big Tent Democrat

A great favor has been done for those who seem not to realize that not all polls are equal. The key to polling is the likely voter model, but polling method is also important. An Indiana University Purdue University Fort Wayne poll, designed by the university but with fieldwork done by SUSA, came out with a strikingly different result than SUSA's own poll: SUSA has Clinton up 16, while the university poll has Obama up 5. How could that happen? Turnout models. Polling methods. Definitions of the relevant universe. Etc.

SUSA explains:

SurveyUSA conducted two separate, independent, non-commingled election polls in the state of Indiana in the past week, one for SurveyUSA’s TV-news clients (the TV poll) and one for Indiana University Purdue University Fort Wayne (the University poll).

· The TV poll was conducted using “RDD” sample (Random Digit Dial, purchased from SSI of Fairfield CT), which is what SurveyUSA uses on many, but not all, of its statewide election polls.

· The University poll was conducted using “RBS” sample (Registration Based Sample, purchased from Aristotle, of Washington DC), which is what SurveyUSA uses on many, but not all, of its research for non-media clients.

. . .

· TV poll interviewing was conducted over 3 consecutive nights that included a weekend (Friday, Saturday, Sunday).

· University poll interviewing was conducted over 3 consecutive nights that included only weekdays (Monday, Tuesday, Wednesday).

· The TV questionnaire was shorter. A TV poll "likely voter" could have answered as few as 8 questions (of the 14 total questions) and be included.

· A University poll respondent needed to answer at least 26 questions (of the 51 total questions) to be included.

· The “who will you vote for in the Democratic primary for President of the united states” question was asked 3rd in the TV poll. The same question was asked 18th in the University poll.

. . .

· TV poll was balanced to most recently available U.S. Census targets for Indiana adults, on gender, age, race and 4 regions defined at the county level by SurveyUSA. To see which Indiana counties are in which region, go here.

· University poll was balanced to demographics of the complete voter file for gender, age and 3 regions, each of which is a cluster of 3 Indiana Congressional Districts. Indiana CD 1, 2 and 3 are combined into a region labeled Northern Indiana. CD 4, 5 and 7 are combined into a region labeled Central Indiana. CD 6, 8 and 9 are combined into a region labeled Southern Indiana.

Any one of these methodological changes, or any combination of them, may contribute to the differences between certain findings in the TV poll and the University poll.
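The "balancing" step SUSA describes is, in essence, cell weighting: each demographic cell in the raw sample gets a weight so the weighted totals match Census (or voter-file) targets. A minimal sketch, with invented categories and shares that are not the actual Indiana figures:

```python
# Hypothetical sketch of demographic cell weighting, as described in the
# SUSA release above. All numbers are invented for illustration.

def cell_weights(sample_counts, target_shares):
    """Weight each demographic cell so the weighted sample matches targets.

    sample_counts: raw respondent counts per cell, e.g. {"18-34": 100, ...}
    target_shares: population shares per cell (summing to 1.0)
    Returns one weight per cell, applied to every respondent in that cell.
    """
    n = sum(sample_counts.values())
    return {cell: (target_shares[cell] * n) / sample_counts[cell]
            for cell in sample_counts}

# Invented example: young voters undersampled, older voters oversampled.
raw = {"18-34": 100, "35-54": 200, "55+": 300}
census = {"18-34": 0.30, "35-54": 0.35, "55+": 0.35}
w = cell_weights(raw, census)
# Each 18-34 respondent now counts for 1.8 people; each 55+ for 0.7.
```

The point for the two Indiana polls: the TV poll balanced to Census targets (all adults) while the University poll balanced to the voter file, so even identical raw responses could be weighted into different headline numbers.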

SUSA puts its name on the TV poll. It does not put its name on the University poll, whose polling method the university designed. The SUSA poll is a SUSA poll. The University poll is NOT a SUSA poll.

    OBAMA'S FOLLOWERS SKEW THE NUMBERS (none / 0) (#1)
    by PssttCmere08 on Fri Apr 18, 2008 at 02:58:19 PM EST
    Over on Huffington Post, you would swear Obama is already in the White House.  

    RDD is typically a little better (none / 0) (#2)
    by andgarden on Fri Apr 18, 2008 at 02:58:42 PM EST
    than RBS. But I think the really telling thing is this:

    The "who will you vote for in the Democratic primary for President of the united states" question was asked 3rd in the TV poll. The same question was asked 18th in the University poll.

    We'll see which one is better, but my guess is the TV RDD poll.

    It is standard theory of polling (none / 0) (#6)
    by Big Tent Democrat on Fri Apr 18, 2008 at 03:01:27 PM EST
    That longer polls are worse and that later-asked questions are less reliable.

    That is simply standard polling theory.

    I know this from my false advertising litigation work.

    Parent

    Makes sense too (none / 0) (#9)
    by andgarden on Fri Apr 18, 2008 at 03:03:38 PM EST
    Maybe you get bored, maybe the commercial break ends, maybe you need to go feed the cat. I would expect that to be especially true for a robopoll. One of its few disadvantages, actually.

    Parent
    Gallup is long (none / 0) (#19)
    by waldenpond on Fri Apr 18, 2008 at 03:26:41 PM EST
    I was galloped a couple of weeks ago.  Some of the questions were a little contorted.  I had to tell them 'no answer' and why.  It went on and on and they asked how many times you laughed in the last week?, yesterday? and today?  'Now' I said, I'm laughing now.

    Parent
    Is it a secret? (none / 0) (#21)
    by Truth Partisan on Fri Apr 18, 2008 at 03:37:27 PM EST
    Funny reply!
    I live in a sparsely populated state and I was polled by all 7 major national and state pollsters within a few days when the race was close here in '04. Do they (except SUSA) share phone numbers? Would explain similar results.

    Parent
    that can be offset (none / 0) (#27)
    by Chisoxy on Fri Apr 18, 2008 at 04:50:51 PM EST
    I think by randomizing the starting point in the survey. Not sure that was done, but if they specifically listed the question # it was asked in then I doubt it.

    Parent
    indeed, but I also liked (none / 0) (#8)
    by TruthMatters on Fri Apr 18, 2008 at 03:02:55 PM EST
    · The University poll was conducted using "RBS" sample (Registration Based Sample, purchased from Aristotle, of Washington DC), which is what SurveyUSA uses on many, but not all, of its research for non-media clients.

    so if it's a non-media client, SUSA tends to go with "RBS." So I guess now the question is: why?

    Parent

    Personally (none / 0) (#10)
    by Big Tent Democrat on Fri Apr 18, 2008 at 03:04:31 PM EST
    I do not think that is the issue. I think SUSA cannot tell you whether the University screened for Likely Voters.

    Why? Because they just gathered the raw data.

    Parent

    its not an issue (none / 0) (#14)
    by TruthMatters on Fri Apr 18, 2008 at 03:10:07 PM EST
    why they chose to use "RDD" in one instance but "RBS" in another.

    I mean, it looks like the difference is coming from using "RDD" vs. "RBS," so I am curious how and when they choose between them, since they say they do both.

    What factors make them say "let's use RDD" or "let's use RBS"?

    Parent

    You miss the point (none / 0) (#16)
    by Big Tent Democrat on Fri Apr 18, 2008 at 03:10:59 PM EST
    Who you call is only part of the equation. Who is a LIKELY VOTER is a completely different equation.

    Parent
    Who you call is a big part (none / 0) (#24)
    by dianem on Fri Apr 18, 2008 at 04:13:03 PM EST
    I have to wonder why SUSA uses RDD on most of its statewide polls. Logically, it must be because they think it's better. Cheaper wouldn't work well for them if it meant that they were getting less reliable results - their reputation is only as good as their latest results. How to define your sample population is one of the most basic and most challenging questions facing any researcher. It can make the difference between getting reliable results and garbage. If you have a non-representative sample, your results will be garbage no matter how carefully you set up your study.

    Parent
    It could be financial (none / 0) (#23)
    by dianem on Fri Apr 18, 2008 at 04:06:53 PM EST
    They purchase the lists of numbers to call. SurveyUSA is probably limited to using the lists they purchase for their own surveys. Their clients will have to pay for or provide their own list of numbers, and there may be different costs associated with different lists.

    This is an interesting story. It had not occurred to me that survey companies would run polls for other people with their equipment. It makes sense. Equipment is expensive, so why not rent it out to recoup some of the cost? But it seems that SurveyUSA could be a bit more careful with letting their name get attached to a poll over which they had little control.

    Parent

    I bet it is cheaper (none / 0) (#25)
    by ineedalife on Fri Apr 18, 2008 at 04:27:33 PM EST
    Fewer calls needed to reach the target number if you start with only registered voters.

    As far as impact on quality, I'm sure SUSA knows the track record of the two but I don't.

    Parent

    Thank you (none / 0) (#3)
    by americanincanada on Fri Apr 18, 2008 at 02:58:43 PM EST
    I wish all posters on all sites were as diligent as you are when it comes to these numbers. The other poll is being touted all over the tubes today as a SUSA poll.

    Newsweek poll is even more crazy... (none / 0) (#4)
    by stefystef on Fri Apr 18, 2008 at 03:00:32 PM EST
    http://www.newsweek.com/id/132721?from=rss

    I don't know where these people are polling or if there are Obama followers fixing the polls, but they are all over the place.  Even the Newsweek poll has people wanting Al Gore.  I give it no value.  

    I am reading dozens of posts of Hillary supporters heading over to PA and I have read a couple of posts of Obama followers who are not having an easy time promoting Obama to non-believers.

    PA will be more interesting than the pundits are predicting.  

    P.S.  I notice Newsweek is very pro-Obama, anti-Hillary.

    National polling is meaningless (none / 0) (#7)
    by Big Tent Democrat on Fri Apr 18, 2008 at 03:02:33 PM EST
    Please take this discussion to the Gallup poll post.

    Parent
    Interesting how (none / 0) (#5)
    by madamab on Fri Apr 18, 2008 at 03:01:21 PM EST
    SUSA is recognized as the gold standard when it reflects well on Obama, but not when it reflects well on Clinton. ;-)

    Thanks for the clarification.

    They were also conducted on different dates (none / 0) (#11)
    by AF on Fri Apr 18, 2008 at 03:06:06 PM EST
    The TV poll was April 11-13, the university poll was April 14-16.  

    That almost certainly doesn't explain all or even most of the discrepancy, but it may well be relevant (for example, it may reflect an initial overreaction to "Bittergate," followed by a return to normalcy).

    Yep (none / 0) (#12)
    by Big Tent Democrat on Fri Apr 18, 2008 at 03:07:49 PM EST
    That is certainly possible.

    Parent
    But (none / 0) (#13)
    by americanincanada on Fri Apr 18, 2008 at 03:09:57 PM EST
    But how likely is it that these two polls would be that far apart and both be accurate?

    Parent
    Not likely (5.00 / 1) (#15)
    by Big Tent Democrat on Fri Apr 18, 2008 at 03:10:16 PM EST
    Well, they're measuring (none / 0) (#18)
    by andgarden on Fri Apr 18, 2008 at 03:14:34 PM EST
    two different universes of voters. We don't know which one is more right. However, the TV polls have been pretty good so far this year.

    Parent
    SUSA: why aren't they all like this? (none / 0) (#17)
    by Truth Partisan on Fri Apr 18, 2008 at 03:14:10 PM EST
    Do we know why more pollsters don't use these methods?
    Are other pollsters changing their methods?
    Why doesn't anyone else seem to care,{creative Class} cough, {media} cough, but just enjoys projecting an Obama win? Truth anyone? (Don't mind me, I'm a Truth Partisan.)

    differences in the questions (none / 0) (#20)
    by DandyTIger on Fri Apr 18, 2008 at 03:30:58 PM EST
    can really make a big difference in the outcome. You'll hear good pollsters admit that you can really get any number you want out of a poll by how you word a question. I didn't read enough to see the differences there, but it would be interesting to see an analysis of questions in the various polls as well as the different methodologies.

    51 questions??? (none / 0) (#22)
    by dianem on Fri Apr 18, 2008 at 03:52:59 PM EST
    With 26 required to be included? I don't know who they are polling, but I have never sat still through a poll that required me to answer 26 questions, much less 51 (except for the census). After about the 20th question I hang up. I wonder how many people do the same thing? I'm betting that the people who would sit still through that many questions would have to be either very bored or very committed to their candidate - and we all know which candidate has the most dedicated fans.
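The dropout effect described in the comments above can be sketched with a toy simulation (none of the numbers are from the actual polls): if committed supporters of one candidate tolerate longer interviews, a question asked 18th is answered by a different mix of people than one asked 3rd.

```python
import random

# Toy simulation of question-order attrition. All parameters are invented:
# "committed" respondents get a patience bonus and so survive deeper into
# the questionnaire, skewing the pool answering late-asked questions.
random.seed(1)

def simulate(n=10000, patience_boost=10):
    answered_q3, answered_q18 = [], []
    for _ in range(n):
        committed = random.random() < 0.5   # half the sample is "committed"
        # number of questions tolerated before hanging up (invented model)
        patience = random.randint(1, 30) + (patience_boost if committed else 0)
        if patience >= 3:
            answered_q3.append(committed)
        if patience >= 18:
            answered_q18.append(committed)
    # share of "committed" respondents among those still on the line
    share3 = sum(answered_q3) / len(answered_q3)
    share18 = sum(answered_q18) / len(answered_q18)
    return share3, share18

s3, s18 = simulate()
# share18 > share3: the question asked 18th is answered by a pool with a
# higher proportion of committed supporters than the question asked 3rd.
```

Under these made-up assumptions, the late-asked horse-race question over-represents whichever side's supporters are more willing to sit through a long interview, which is the commenters' concern about the 51-question University poll.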

    BTD, another point in SUSA's favor.... (none / 0) (#26)
    by Oje on Fri Apr 18, 2008 at 04:44:19 PM EST
    SUSA's "TV-polls" seem to be omnibus surveys. While the separate buyers have input, the multiple interests involved tips the balance of decision-making power to the survey vendor, not the client (this is what works for JD Power's industry awards). Generally, the legitimacy (and profitability) of the survey is augmented by the extensiveness of buy-ins for the survey instrument and data. In short, SUSA has an economic incentive to be less biased and more accurate in their TV polls. Additionally, because (presumably) media outlets will reflect differing local political and economic interests, local political biases are muted (this of course is complicated by national media conglomerations).

    Anyway, knowledgeable social scientists and statisticians - as opposed to account executives (salespeople) - can exercise a greater degree of control over the survey methodology and instrument in an omnibus survey. The account executives go to the media outlets with the commodity they have, not the commodity commissioned by the media buyers. Surveys commissioned by individual clients (CNN/Time, USAToday, NBC, etc.) are subject to greater manipulation because the salesperson will seek to supply a product that reflects the media buyer's perceived demands - that is, a survey that fits the "media narrative."

    At one time, I considered the tracking polls the most accurate measures of the Democratic race. But the Gallup tracking poll came under scrutiny when its client's poll, USAToday, produced vastly different support. Since that time, the Gallup tracking poll suddenly began to exhibit more dramatic swings in support that mirror the USAToday polls (I think they changed their model, but there is no admission to that effect). In any case, the Gallup tracking poll suffered from the economic pressure and scrutiny of client-commissioned (i.e., media-narrative-driven) surveys.

    Political-economic pressures could eventually taint SUSA's polls, but for now it seems that SUSA has been able to weather the storm of media narratives and produce quality work. I guess I am just saying that SUSA's accuracy to date has some economic incentive behind it. The fact that they felt a need to explain the differences, though, is a flag for concern. When tracking SUSA's work in the future, you might want to consider whether different political and economic pressures have come down upon SUSA's work. Right now, though, they are still Teh World's Greatest Pollster.