Polling: A guide to what you need to know when you pick up the phone

Editor’s note: This is the second in a two-part series by Kate Robinson on the use and misuse of public opinion polling today. For an analysis of how polling works and how Vermont media companies and politicians use polling, check out Robinson’s first story, “Everything you wanted to know about polling, but were afraid to ask.”

Elmo Roper, a pioneer in the field of market research and opinion polling.


If you are a reluctant polling participant, knowing what different pollsters are after can make you feel more comfortable about staying on the line or hanging up when the phone rings.
 
Some companies conduct polls to burnish the reputations of the politicians who hire them. Others are interested in monitoring the fluctuations of popularity over a campaign cycle or determining what issues the politician’s constituents find controversial.  
 
Negative polling, the kind that misrepresents an opponent’s views through deliberate misinformation in order to alter the public’s perception of a candidate or issue, is on the rise.
 
By contrast, some organizations, like Public Policy Polling (PPP), concentrate on turning out hundreds of polls each election season and pride themselves on “just-the-numbers, ma’am” accuracy.
 
Public Policy Polling advertises its ability to accurately track gains and losses over the course of a campaign: “In an instant, the narrative surrounding the race changed,” the firm writes, citing one example of its ability to gauge the direction of a candidate’s popularity.
 
Politicians have for years hired polling firms to test the waters, sampling the voting population to find out which groups they need to appeal to and which issues they should focus on. But Castleton’s director, Rich Clark, whose goal is to offer unbiased research, sees it differently: “It would be extremely difficult to work with a candidate who is purposively campaigning without losing credibility.”
 

Rich Clark, founder of Castleton Polling Institute at Castleton State College.


A very different approach is signaled in Public Opinion Strategies’ (POS) online brochure: “At Public Opinion Strategies we have developed a culture and approach to winning. For more than a decade, Public Opinion Strategies has been helping political candidates win tough campaigns, scoring hard-fought success in the public affairs arena for some of America’s leading corporations and associations, and giving our clients an advantage in their proceedings.” Their polling can polish the image of the politician who hires them, or it can be “negative polling” intended to distort an opponent’s record, positions on the issues or personal character.

The trend nationwide, however, is to use polling to fire up specific voter groups.

In a country that most observers see as increasingly polarized, these interest groups are identified through polling and then wooed with appeals to their concerns, with the aim of increasing turnout among supportive voting blocs, as Thomas Edsall of the Columbia University School of Journalism recently reported in the New York Times.

“In the 2004 election, the Bush-Cheney campaign explicitly sought to polarize voters to increase turnout among potential supporters,” he wrote.

“Matthew Dowd, who ran polling and much of the planning for Bush, explained the 2004 Republican strategy to me just after the election. ‘Most voters looked at Bush in very black-and-white terms,’ he said. ‘They either loved and respected him, or they didn’t like him.’ In the face of this reality, ‘we systematically allocated all the main resources of the campaign to the twin goals of motivation and persuasion. The media, the voter targeting, the mail — all were based off that strategic decision.’”

News reports also cover “public opinion” as registered in surveys that range from automated random-digit dialing polls using interactive voice response (IVR) technology to the classic live person-to-person telephone survey, which also uses random-digit dialing.

Whether television, radio or print, media now offer on their websites information on how the polls are done — what, if any, scientific sampling methods are used and what other bona fides are offered by the polling organization, including the full questionnaire and results.

But providing this information is honored more in the breach than in the observance at this point. Taking a closer look at what these bona fides are will help as the election season unfolds.

Let’s imagine you are taking a poll on polling. Here the questions are followed by some information that might make it easier to answer when the pollster calls.

Do you find poll/survey results newsworthy? (Answers: Yes / No / That depends / Not sure.)

There’s no right answer to this, but if your inclination is to say “That depends,” you are probably right with the majority. The media are omnivorous purveyors of all kinds of poll results and they wouldn’t report them were they not in demand. Should you wish to know where Vermont ranks in popularity compared to the other 49 states, you can find out. (A recent poll said 15th.) Should you wish to learn about American attitudes on tattooing, there is a Harris Interactive poll that has investigated this.

When are published poll results to be taken seriously?

The simple answer is that public opinion on the issues that affect the country is important and can change and public opinion polling can, if done according to high professional standards, track those changes. Even then, remember that the news may not tell you anything of factual import. This is opinion polling, after all.

If the phone rings and it’s a polling organization calling, are you more likely to say yes to giving your opinion or more likely to say no? (Answer: Very likely to say Yes / Likely to say Yes / Likely to say No / Very likely to say No / Not sure / Don’t know.)

“In the latest Tomkins Poll, results indicate something that is probably not surprising to most Americans, but may come as a shock to the poll-takers themselves: Nine out of 10 Americans despise being asked survey questions.” All right, this is a spoof.

But Americans do have a love-hate relationship with polls. Response rates, the share of people contacted who actually complete a survey, have been falling for at least 20 years. “They’re now in the low 20 percent,” reports Rich Clark of the Castleton Polling Institute at Castleton State College. New methods of reaching the public have not resulted in improved reliability of polls, just in more polls.
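To get a feel for what that means in practice, here is a minimal back-of-the-envelope sketch. The roughly 20 percent response rate comes from Clark’s figure above; the 800-person sample target and the 40 percent comparison rate are purely illustrative.

```python
def calls_needed(target_respondents, response_rate):
    """Rough estimate of how many people must be reached to complete a sample,
    ignoring busy signals, callbacks and screening for likely voters."""
    return round(target_respondents / response_rate)

# Illustrative figures: an 800-person statewide sample at the roughly
# 20 percent response rate Clark describes, versus a hypothetical 40 percent.
print(calls_needed(800, 0.20))  # 4000
print(calls_needed(800, 0.40))  # 2000
```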

The long-established Gallup Organization begins its introduction to the methodology it uses by saying: “Public opinion polls would have less value in a democracy if the public — the very people whose views the polls represent — didn’t have confidence in the results. This confidence does not come easily. The process of polling is often mysterious, particularly to those who don’t see how the views of 1,000 people can represent those of hundreds of millions.”

It should be noted that the standard for a national sample has risen to 1,200 to 1,500 today, with 800 considered sufficient for a statewide poll. This is in part because any reading on subsamples, such as men and women, different age groups, or Republicans and Democrats, carries a larger margin of sampling error, since the number of respondents in each group is smaller. The “topline” results, for example, may have a stated margin of error of plus or minus 3.5 percent, but that figure is based on the full respondent sample. For a subgroup, it may rise to plus or minus 4.5 percent or higher.
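To see where figures like “plus or minus 3.5 percent” come from, here is a minimal sketch of the standard worst-case calculation for a simple random sample at 95 percent confidence. It is an approximation only; real polls also adjust for weighting and design effects, which this ignores.

```python
import math

def margin_of_error(n, z=1.96):
    """Worst-case (50/50 split) margin of sampling error for a simple random
    sample of size n, at 95 percent confidence."""
    return z * math.sqrt(0.25 / n)

for n in (1500, 800, 400):
    print(f"n = {n:>4}: +/- {margin_of_error(n) * 100:.1f} points")
# n = 1500: +/- 2.5 points
# n =  800: +/- 3.5 points
# n =  400: +/- 4.9 points
```

An 800-person statewide sample works out to about plus or minus 3.5 points, matching the topline figure above, while a 400-person subgroup of the same poll is closer to plus or minus 5.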

Do you consider yourself an informed consumer of poll results? (Yes / No / Not sure)

A Yes response is justified if a) you know that polls, which tell us something about the opinion of a very large population by measuring a small sample that represents it, depend on random sampling done in a way that ensures everyone in the group being surveyed has a known chance of being selected; and b) you know that the “margin of sampling error” alone is not enough to qualify a poll’s results as valid. Transparency about who paid for the poll, what the full set of questions and responses was, and what polling standards and practices the organization adheres to are also basic requirements for a legitimate poll.

And one thing legitimate survey researchers will never do is try to sell you something. That is called telemarketing. According to direct marketing industry estimates, there are now 2 million telemarketers who call the general public to sell products and services.

Lastly, if you get a call from a polling organization, you are entitled to ask, and to get an answer to, the question ‘Who is sponsoring this poll?’ before you answer any questions. In January, Terri Hallenbeck, a Burlington Free Press reporter, wrote that she got a call asking for her opinion on whether to relicense Vermont Yankee. When she asked who was sponsoring the poll, the caller refused to answer, saying that doing so might influence the answers given. But disclosing that information is the first of the 11 basic principles of legitimate polling listed under the National Council on Public Polls Level 1 Disclosures.

Which of the following types of polls do you consider worth responding to? a) Telephone polls, with a human interviewer; b) Telephone polls using Interactive Voice Response (IVR) technology (responses are made using the phone keypad); c) Online polls; d) Exit polls; e) Push polls.

What’s important here, along with knowing who is doing the poll and how the information will be used, is that only a) and b) can provide a true random sample, because they draw phone numbers through random-digit dialing rather than letting respondents select themselves. There is no value in poll results that come from a self-selected set of respondents, no matter how much fun they are.
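As a rough illustration of what random-digit dialing involves, here is a minimal sketch. The 802 area code is Vermont’s, but the exchanges below are placeholders, and a real sampling frame would be built from telephone-exchange databases and would include cell-phone number banks, neither of which is modeled here.

```python
import random

# The 802 area code is Vermont's; these exchanges are hypothetical placeholders.
AREA_CODE = "802"
EXCHANGES = ["229", "244", "362", "434", "524", "658", "773", "862"]

def random_digit_dial_sample(k, seed=None):
    """Draw k phone numbers by attaching random last-four digits to working
    exchanges, so listed and unlisted numbers have the same chance of selection."""
    rng = random.Random(seed)
    return [
        f"{AREA_CODE}-{rng.choice(EXCHANGES)}-{rng.randrange(10000):04d}"
        for _ in range(k)
    ]

print(random_digit_dial_sample(5, seed=1))
```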

Sen. Bill Doyle’s Town Meeting Day survey, now in its 43rd year, says upfront that it is “unscientific”; that is, its results reflect a sample of voters who are self-selected and thus cannot be relied on to be representative. As a lesson in what this can mean, compare the responses to the question about a four-year term for governor, asked both on Doyle’s handout and in the Castleton poll. Last year, the Doyle poll (whose results for this year are not complete yet) found 79 percent in favor of the change, similar to previous years’ results. The Castleton results: 58 percent for, 25 percent against and 17 percent no opinion.

Should either poll lead to action in the Legislature? Doyle told VPR that the return on his survey question “is an endorsement.” The Castleton results support that, but much less resoundingly. Neither, however, posed any questions about real-life choices that would have to be made if the term were increased.

“Push polls,” says Kathy Frankovic, director of surveys for CBS News, are not really polls at all, but “telemarketing masquerading as a poll.” There is no attempt to measure opinion by collecting information; the point is to “push” an opinion onto the person called, and no one will analyze the data. Push polls are also usually very short, so that the political campaign or interest group can reach thousands, even tens of thousands, of potential voters quickly while minimizing the cost. The goal is to spread misinformation that may win votes for the campaign. When you get a push poll call, you may be asked for by name, something a qualified polling organization almost never does. And the call will not ask for demographic information, which is needed for legitimate research. Push polls also contain negative information, some of which may be true but much of which is false. (See http://www.cbsnews.com/2100-250_162-160398.html for more.)

As for online polls that appear in your email or on a website, Facebook recently warned its members about scam “surveys” with a notice saying, “Scammers have identified Facebook as an easy place to make money through survey scams. These scams are now rampant and there are too many variations to describe here in detail.” There follows a description of the “bait” offered, starting with extreme headlines (lots of CAPS) and too-good-to-be-true offers of iPhones or iPads. Links take you to a facsimile Facebook page where you are asked to fill in a survey and “Like” the page, further spreading the scam and lining the pockets of the scammer and the marketing company that hired him.

So, is it worth saying yes when a pollster calls, even if it’s an automated “interactive voice” call? (Answer: Yes / No / Not sure)

Here’s why you might say Yes, according to public opinion researchers: Each year, tens of thousands of Americans respond to poll interviews on subjects of national interest. They do so because they want their opinions heard. A scientific poll or survey is an unbiased way for people to make their views known to each other, to their government, to businesses, to educators and many other institutions. This is one way for average Americans to add their voices to the debate over important issues of our day.

Kate Robinson

Comments

  1. Liz Schlegel :

    Really interesting pair of articles – thank you!

    Last year we got a series of polling calls (purporting to be) from the VT Department of Health asking to speak to an adult in the household – and then, when I answered, stating that they needed to speak to the man of the house. They must have called 6-8 times trying to speak to my husband. Huge waste of resources.

  2. David Usher :

    Here is a BIG problem with telephone polls that ought to be addressed. Every poll I’ve participated in was a call to my published landline telephone number.

    Now, millions of people have abandoned landline phones in favor of wireless phones for which there is no published directory. Many may have done so for economic or other reasons.

    I believe a built-in demographic bias may exist in polls in that people with only wireless phones are not in the polling sample. These folks may represent a different viewpoint on a topic that is missed if they are never polled.

  3. Corrine Ross :

    Polling calls are as unsolicited as telemarketing calls are and they should be regulated as well.

    Personally, I’ve lost faith in these polls so I consider getting such calls a nuisance. And whenever I get one, I report the phone number at http://www.callercenter.com. These politicians wanted publicity, then it’s publicity I gave them. Just that, it was negative.

