Monday, March 7, 2011

Polling 101 - How People Answer Polls

This is the first in a series of GUEST POSTS by Matthew Knee, a Ph.D. candidate at Yale University specializing in campaigns and elections, ethnic voting patterns, public opinion, and quantitative and experimental approaches to political science.

____________________

Analyzing polls with only what polling companies release is a tricky business. Near-ideal poll analysis requires a database of actual, person-by-person responses, expensive software, and advanced mathematics. Ideal poll analysis requires actually being the pollster and having an overstuffed budget. However, there are a number of rules, tips, and tricks that anyone - with a bit of logic and a calculator - can use to draw meaningful conclusions from flawed polls and incomplete information.

I will be addressing these issues in three stages. In the first section, I will talk a bit about how people answer polling questions. In the second, I will discuss samples and biases. In the third, I will discuss techniques for evaluating the seriousness of bias.

All-purpose disclaimer: This series will include approximations and simplifications. It is for understanding media polls, not for writing articles for scholarly journals. It is also not exhaustive. The list of specific problems that can arise, especially in poll wording, is, obviously, enormously long.

How People Answer Polls

1. Most People Probably Don’t Think About Politics Like You Do

Most people do not think like bloggers and blog readers. We tend to have more set views on issues and are generally less susceptible (but not immune) to manipulation by bad polls or information. Most people, however, do not have a big chart in their heads listing the most significant issues and a set view on each. They probably do have set answers to questions such as “which presidential candidate will you vote for next Tuesday,” but are more likely to provide inconsistent answers about, say, cap and trade or even abortion.

Polling is not data retrieval. Rather, it is a process of activating associations and considerations to produce a response. People do have opinions and emotions about particular figures or values, but often do not have consistent views on specific policies. This means that, frustratingly, there is no one correct answer to what certain people think about certain topics – only how answering specific questions about them illuminates the complexities and contradictions within.

This view of polling, pioneered by UCLA’s John Zaller and others, explains the importance of wording and ordering polls correctly.
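For the quantitatively inclined, here is a toy sketch of the idea (this is not Zaller's actual model; the considerations and weights below are invented for illustration): if a respondent answers by tallying whichever few considerations happen to be on the top of his head, then whatever the question primes can swing the answer, even though nothing about the respondent's underlying views has changed.

```python
import random

# Hypothetical considerations one respondent might hold about some policy.
# +1 pushes toward "support", -1 pushes toward "oppose". Invented for illustration.
CONSIDERATIONS = {
    "helps people like me": +1,
    "my party backs it": +1,
    "costs too much": -1,
    "government overreach": -1,
}

def answer(primed=None, n_sampled=2, rng=random):
    """Answer one question by sampling a few top-of-mind considerations.

    A primed consideration (say, one evoked by the question's wording) is always
    among those sampled; the rest are drawn at random. The response is whichever
    way the sampled considerations lean on balance (ties are a coin flip).
    """
    pool = list(CONSIDERATIONS)
    sampled = []
    if primed is not None:
        sampled.append(primed)
        pool.remove(primed)
    sampled += rng.sample(pool, n_sampled - len(sampled))
    lean = sum(CONSIDERATIONS[c] for c in sampled)
    if lean == 0:
        return rng.choice(["support", "oppose"])
    return "support" if lean > 0 else "oppose"

def support_rate(primed=None, trials=10_000):
    rng = random.Random(0)  # fixed seed so the illustration is reproducible
    return sum(answer(primed, rng=rng) == "support" for _ in range(trials)) / trials

print("no prime:              ", support_rate())                     # about 50%
print("primed with costs:     ", support_rate("costs too much"))     # about 33%
print("primed with party cue: ", support_rate("my party backs it"))  # about 67%
```

The same simulated "respondent" supports the policy about half the time with no prime, but looks noticeably more supportive or more opposed depending on which single consideration the question brings to mind.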

2. Question Wording And Order Can Significantly Impact Responses Through Priming and Framing

There are many ways questionnaires can be manipulated. Priming is the act of reminding a respondent of an association you want them to consider. For instance, using the phrase "Bush tax cuts" rather than "2003 tax cuts" will likely trigger whatever feelings a respondent has about Bush. An example of priming in popular culture is the beginning of South Park’s “With Apologies to Jesse Jackson” episode. The episode begins with a character about to solve a Wheel of Fortune puzzle for which the N-word is a possible (but incorrect) answer. While he thinks, the African-American cameraman peeks out from behind the camera, filling the screen with an African-American face and priming the audience to conclude, as the character is about to, that the N-word is the answer to the puzzle. (A better implementation would have shown the face before the puzzle.)

Framing is the act of telling respondents how to contextualize a question. In one classic framing experiment, researchers found significant differences in college students' willingness to allow a Klan rally on their campus based on whether it was presented as a free speech issue or a public safety issue.

These effects can carry over into later questions. For instance, people respond differently to questions on abortion when they are preceded by questions about women's rights than when they are preceded by questions about religion.

Similarly, question order can force respondents to commit to certain positions at certain times, potentially affecting later responses by those who do not want to look like hypocrites. A Cold War-era study found that only 37% of respondents were willing to allow Soviet journalists into the United States when asked that question first, but 73% gave that answer when first asked whether American journalists should be allowed into Russia.

3. Many Wording Effects Relate To Talking In Terms Of Gains Or Losses

There are a number of consistent patterns in the effects of wording. One important pattern is the difference between framing in terms of gains or positives and framing in terms of losses or negatives. People prefer beef that is 80% lean to beef that is 20% fat, and medical treatments described as saving 80% of patients to identical ones described as letting 20% die.

People are also more reluctant to take something away than to decline to give it in the first place. People often don't like being the bad guy. Thus, asking about taking away collective bargaining rights is in some ways a biased wording (and not just because of the loaded word “rights”) compared to asking whether public employees should negotiate their salaries. On the other hand, it is genuinely difficult to produce a media narrative that words the situation any other way.

4. There Ain’t No Such Thing As A Free Lunch – But As Long As It’s On The Menu, People Will Order It

When not forced to prioritize, people tend to prefer cutting taxes and increasing or retaining individual spending items. Combined with the reluctance to assent to proposals framed as taking things away, this leads people to oppose specific cuts to government even while supporting smaller government in theory.

This also means that the tough decisions required to rein in runaway spending will usually be unpopular, at least without political cover from a severe crisis paired with bipartisan consensus or extremely unpopular scapegoats. The question facing Republicans is not necessarily “How can we balance the budget in a way that does not cost public support?” but “Given that the Democrats are unwilling to forgo political advantage for the sake of the country, how can we save our fiscal future at the lowest political cost?”

5. Options Provided Sometimes Matter

Pollsters usually provide options for respondents’ answers to each question. Sometimes the manner in which the policy space is divided can influence results. For instance, the recent NYT/CBS poll on public sector unions asked whether people prefer balancing the budget by raising taxes, cutting public employee benefits, cutting road funding, or cutting education, and the pollsters noted that a plurality preferred to raise taxes. By dividing spending cuts into multiple options while offering only one tax-increase option, the poll creates the illusion that more people back tax increases than spending cuts, when in fact more people opted for the latter.
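A quick back-of-the-envelope check shows how this works (the numbers below are made up for illustration, not the actual NYT/CBS results):

```python
# Hypothetical response shares: one tax-increase option vs. three separate
# spending-cut options. Invented numbers, NOT the actual NYT/CBS results.
responses = {
    "raise taxes": 0.40,
    "cut public employee benefits": 0.25,
    "cut road funding": 0.20,
    "cut education": 0.15,
}

plurality_winner = max(responses, key=responses.get)
cuts_total = sum(share for option, share in responses.items() if option != "raise taxes")

print(f"Plurality winner: {plurality_winner} ({responses[plurality_winner]:.0%})")
print(f"All spending cuts combined: {cuts_total:.0%}")
# "Raise taxes" wins the plurality at 40%, yet 60% chose some form of spending cut.
```

Whenever one side of a choice is split into several options and the other side is not, the unsplit side gets an artificial head start in the horse-race presentation of the results.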

*   *   *   *

I hope you enjoyed the first part of my polling primer. Come back tomorrow for a discussion of sampling issues.


9 comments:

  1. Heh, love the heading for item 4--it explains perfectly why polls showed (as leftists constantly crowed) that people supported some individual aspects or the general idea of the healthcare monstrosity but consistently rejected it in its entirety. An apparent contradiction that pundits and pols understood perfectly well, but that, when trumpeted by a left-leaning media as "evidence" that the American people supported ObamaCare, suggested that . . . well, that people supported it even if they said quite clearly they didn't. Of course, they did themselves no favors by turning that into an "Americans are stupid" meme that simply alienated the very people they were trying to convince.

    This administration never had "a messaging problem"; they were asking people to look at a fantasy menu that listed "Free Lunch" at the tippy top in neon letters. People ordered one up, but when they were offered the crap sandwich that was the healthcare monstrosity, they understood that even a "free" crap sandwich should probably not be accepted, much less eaten.

    And yes, enjoyed your primer muchly. Looking forward to the second part on sampling.

  2. Excellent information! Thank you for giving the bloggers and readers some basics that most don't know about polling. I especially like #5 because it is what I've seen all along - since 2008, at least.

    Looking forward to the other parts in your series!

  3. Excellent!

    I look forward to seeing the rest.

    I have always been skeptical of polls unless they release the questions asked. (These days most do not. Years ago it would occur more than now, and I believe this relates to the increasing willingness of those in the media to frame their 'reporting' about polls to reinforce the liberal viewpoint held by most in the media.)

  4. I've wanted to be polled, but the few who have called stop asking me questions when they learned I am a conservative. I screen my calls now.

    Sorry to veer off topic, but I'd love to read Prof's opinion of this: Disgusting… Leftists Move to Have Prominent Republican’s Children Expelled From State University

    ...

  5. "Most people do not think like bloggers and blog readers. We tend to have more set views on issues and are generally less susceptible (but not immune) to manipulation by bad polls or information."

    In the interest of creating a more informed citizenry, I call for making Legal Insurrection mandatory reading for the susceptible and easily manipulated. I know it's arguable these are somewhat subjective terms, but it's agreeable that a workable criterion would be all Obama voters.

    Great work, Matthew! I look forward to the next installment. It's a great feeling, like being back in school learning about something we're actually passionate about, without any exams or term papers looming on the horizon!

  6. As most people understand, polls are designed exclusively to sway the opinion of the weak-minded or uninformed:

    "Dang, I never much cared for it, but if the polls say 60% are for it, then I guess it's ok."

    I give no credit whatsoever to any survey, regardless of its source.

    In over 50 years, I have yet to be "polled". Well, there was that one time in prison, but that's a story for another day...

  7. I would be interested in finding out the scientific basis for polls that use "all adults" or "all voters". In short: is it possible to know if these types of models have any basis in fact?

    I can understand polls based upon "likely voters", since real elections are something these polls can be compared against for accuracy. But polls that purport to sample "all voters" or "all adults" - what reason do we have to believe these are accurate reflections of what all voters or all adults think? Over and over again I hear how samples of "all voters" or "all adults" will include more Democrats since (it is claimed) Democrats are less likely to vote. But what evidence supports that claim?

  8. @FuzzySlippers, DINORightMarie, jakee308, Evinx, and LukeHandCool. Thanks!

    @Mwalimu - It all depends on what you are trying to measure. If you are interested in what people in general think, an adults poll will tell you that (and yes, adults polls tend to be a little more liberal than registered or likely voter polls). If what you really want to know is what people who will vote in the next election have to say, then you have to figure out who will vote in the next election.

    One way is to just poll registered voters. It's simple and objective, and is a more accurate depiction of the electorate than adults, since those who are already registered are more likely to vote in the next election than those who are not.

    Another way is to make bolder guesses as to who will vote in the next election, based on personal voting history, demographics, partisanship, and whatever other variables you want. This is a more subjective process, and while you might guess right, you might guess very wrong. However, if done right, likely voter models have the greatest potential for accuracy. Sometimes pollsters will use adults or registered voters until it gets relatively close to an election, so they have time to get a good idea of what they think the electorate will look like.
