April 29, 2013, 12:10 p.m.

Public opinion polls do not always report public opinion

Sociologist Herbert Gans says the news media should do a better job noting that “polls are answers to questions rather than opinions,” and that not all opinions have the same intensity — or the same impact.


Editor’s note: Herbert Gans is a professor emeritus of sociology at Columbia University and the author of, among other works, Deciding What’s News. Read his previous essays for Nieman Lab on journalism and democracy here and here.

Polls have long been newsworthy, but never more so than when their conclusions can be contrasted with contrary behavior by politicians, the recent gun control debate being a particularly dramatic example. The pollsters’ finding that 90 percent of their respondents said they favored universal background checks for gun purchases was juxtaposed (except by Fox News) with the Senate’s filibustered rejection of such legislation.

More interesting and important, the news media turned poll respondents’ answers to pollsters’ questions into the expression of public opinion. In effect, the news media, and later many politicians, including President Obama, seemed to imply that the Republicans refused to listen to vox populi. Some may even have been thinking that the polls were sometimes a better instrument of American democracy than its elected officials.

In one respect, the polls are more democratic; they report the opinions of a random sample of the entire population, while elected officials have been chosen by an electorate that at best includes 60 percent of eligible voters and at worst many fewer. Thus, when 90 percent of poll respondents agree on the answers to polling questions, the polls are sending a message about majoritarian democracy.

In other respects, however, polls are not the best representative of the popular will, for people’s answers to pollster questions are not quite the same as their opinions — or, for that matter, public opinion.

The pollsters typically ask people whether they favor or oppose, agree or disagree with, or approve or disapprove of an issue, and their wording generally follows the centrist bias of the mainstream news media. They offer respondents only two sides (along with the opportunity to say “don’t know” or “unsure”), thus leaving out alternatives proposed by people with minority political views. Occasionally, one side is presented in stronger or more approving language — but by and large, poll questions maintain the balanced neutrality of the mainstream news media.

The pollsters’ reports and press releases usually begin with the question asked and then present tables with the statistical proportions of respondents giving each of the possible answers. News stories about the polls, however, usually report only the results; by leaving out the questions and the don’t-knows, they transform answers into opinions. When those opinions are shared by a majority, the news stories turn poll respondents into the public, thus giving birth to public opinion.

Normally, the news story tells what proportion of that public favors the legislation being questioned or rejected by the Beltway politicians. Indeed, such polls are newsworthy in large part because the reportage is framed as a conflict between majoritarian opinions and politicians’ rejection of the popular will.

To be sure, poll respondents favor what they tell the pollsters they favor. Still, poll answers are not quite the same as opinions. While respondents’ answers may reflect their already determined opinions, they may also express what respondents feel, or believe they ought to feel, at the moment. Pollsters should therefore distinguish between respondents with previously determined opinions and those giving spur-of-the-moment answers to pollster questions.

However, only rarely do pollsters ask whether the respondents have thought about the question before the pollsters called, or whether they will ever do so again. In addition, polls usually do not tell us whether respondents have talked about the issue with family or friends, or whether they have expressed their answer cum opinion in other, more directly political ways.

In fact, respondents incur no responsibilities with their answers, no subsequent obligation to vote or do anything else. Conversely, politicians can lose the next election with a vote that angers their base.

If poll results can be interpreted as opinion, they are pollster-evoked, or passive, opinions. They are not the active opinions of citizens who feel strongly about, or participate in some way in, the debates over forthcoming legislation or a presidential decision.

Elected officials may take passive opinions into account, but they pay far more attention to active opinions. Above all, however, politicians listen most closely to the usual suspects with power: influential citizens, Congressional leaders and whips, lobbies, and campaign funders.

Jennifer Steinhauer of The New York Times was right on target when she described the poll results as an expression of “national sentiment,” which she then contrasted with the Senate’s “political dynamic.”

Some corrective fixes

Since polls will continue to be used as indicators of public opinion, the news media should add some context to their reporting of the results. From time to time, they should remind the news audience that polls are answers to questions rather than opinions, just as they now remind audiences of the polls’ margins of error.

In addition, pollsters should be urged to pose and report intensity questions, telling politicians and the public how strongly respondents feel about what they tell pollsters, and whether they have been politically active on behalf of those feelings.

At the same time, the news media should keep track of other kinds of intensity measures. For about 30 years, the Pew Research Center has been reporting which news stories a national sample says it follows very closely. Some respondents may exaggerate that closeness, but few stories are followed closely by more than 50 percent of the sample. Over the years, stories that touch people emotionally and those that are personally relevant have always scored highest.

In 2012, the Sandy Hook tragedy was followed very closely by 57 percent, and rising gas prices by 52 percent. In late January 2013, the gun control debate reached a high of 42 percent and stood at 37 percent in early April. The debates over the debt limit and immigration were followed very closely by just under 25 percent of the Pew sample, but 63 percent followed the Boston Marathon bombing very closely.

Better ways the news media can put the passivity of poll opinions into context include the following:

  • Report news about active citizen expressions of opinion at local town halls, organized debates, demonstrations, teach-ins, and the like. Gatherings involving predominantly adult and older mainstream Americans are particularly important, and some politically conscientious websites could count and report the number of such active expressions, large and small, all across the country.
  • Keep track of the number, content, and tone of phone calls, letters, and other communications to elected officials, particularly those directly involved in an issue. Spontaneous communications should have priority over organized ones, notably the now-ubiquitous petitions requiring only a single click on a website.

    In fact, the mainstream news media, journalistic websites, and other enterprising fact-finders should regularly be asking elected and appointed officials about communications and visits from citizens on currently debated political and social issues.

  • Plan follow-up stories after legislation dealing with major problems and issues has been approved or rejected. Such stories are already being reported, but for the purpose of putting poll results in context, they should emphasize what citizen communications politicians received and try to find out which ones they took into account.

    Regular reporting of such stories would add to public understanding of which kinds of citizen participation and active opinion the politicians consider. That would also help people understand the place of polls in democratic politics, and perhaps lead to debates about whether they can or should play a larger role in politics. Such debates might even stimulate journalistic and other discussions of the pros and cons of majoritarian democracy.

Photo by Mark Sardella used under a Creative Commons license.
