RALEIGH, N.C. — Americans are constantly being asked what they think about almost anything. The dirty secret, though, is that, despite answering these survey questions, many people lack a meaningful opinion to report. When this happens, surveys are measuring "non-attitudes."
I’d argue that, instead of revealing meaningful attitudes, many, if not most, people are politely answering the question. Some respondents think it would be rude not to answer, even if they haven’t thought about the issue, so they do. Others answer survey questions because they believe saying they don’t have an opinion makes them look stupid.
Most polls don’t provide respondents with an answer option that explicitly states, "I don’t know" or "I have no opinion." Thus, respondents choose an answer from the list of possible answer options they are given. One reason this practice is routine is that these polls are often designed for media consumption. Letting people say they don’t care or haven’t thought enough about a topic to have an opinion would undermine the newsworthiness of the survey results.
Exceptions sometimes occur. Recently, the Wall Street Journal reported on what happens when respondents are given the opportunity to say they don’t have an opinion. Americans were asked how they felt about the U.S. handling of the civil war in Syria and also the rise of the terrorist group ISIS. In both instances, the plurality response was, "I don’t know enough to have an opinion." Notice in my first example that the "don’t know" answer option was not given to respondents. So, even though the topic is similar to the WSJ questions where the most frequent response was "no opinion," almost everyone in the first survey answered the question.
So, how can you tell if surveys are measuring non-attitudes? I teach a class at North Carolina State University called, "Public Opinion and the Media," and we created a checklist to help people determine if a high percentage of survey responses reflect non-attitudes. There is no definitive way to distinguish a real opinion from a non-attitude, but the following considerations should be helpful.
1. Were respondents asked if they were interested in the topic, or if they had given it much thought, before being asked what they thought about it? This is called asking a "screening question." The goal is to prevent respondents who admit to not caring or thinking about an issue from being asked their opinion about it. Removing respondents this way has the undesirable effect of reducing the sample size, and thus increasing the margin of sampling error around the results for opinions expressed by the remaining respondents. It also means the sample whose opinions are measured is no longer random and no longer represents the entire population, but rather a more attentive one. Nevertheless, if people weren’t first screened for non-attitudes, the chances increase that non-attitudes were measured. This is something I've studied in my own research about what happens to opinions about free trade when screening questions are or are not used.
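The sample-size tradeoff above can be quantified with the standard margin-of-error formula for a proportion, z·√(p(1−p)/n). The poll sizes below are hypothetical, chosen only to illustrate how screening out, say, 40 percent of respondents widens the margin of error:

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """95% margin of sampling error for a proportion p with sample size n."""
    return z * math.sqrt(p * (1 - p) / n)

full_sample = 1000      # hypothetical poll of 1,000 respondents
screened_sample = 600   # hypothetical remainder after a screening question

print(f"{margin_of_error(full_sample):.1%}")      # about 3.1 points
print(f"{margin_of_error(screened_sample):.1%}")  # about 4.0 points
```

The formula uses p = 0.5, the worst case, which is the convention pollsters use when reporting a single margin of error for a whole survey.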
2. Is the topic widely discussed and/or covered by news media? Topics that are rarely discussed in the news are ripe for non-attitudes. The less an issue is covered, the less likely people are to pay attention to it. Since exposure to news and conversation is important to opinion formation, its absence makes it more likely people aren’t forming meaningful opinions about that topic. For example, a 2007 study looked at how news coverage affected opinions and the measurement of non-attitudes about the Patriot Act.
3. Do respondents flock to the middle? Surveys sometimes provide respondents with the opportunity to pick a middle-ground option, such as saying they feel “neutral.” Respondents with non-attitudes often choose the midpoint of an answer option scale. For example, if respondents were asked if they favored or opposed raising the minimum wage and were also told they could report feeling neutral about it, those lacking real opinions tend to pick the neutral option. Of course, the midpoint of an answer option scale is also what people might choose if they are deeply conflicted and can't decide, but that happens less often.
4. Do respondents have strong opinions about the issue? It is good practice to not only ask if people support or oppose a policy position, or see if they agree or disagree with a policy statement, but to also ask if they feel that way "strongly" or "not strongly." Respondents reporting non-attitudes rarely feel a particular way "strongly." So, if very few people overall say they strongly support or oppose a policy, it's more likely that many of them don’t really have much of an opinion at all.
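Checks 3 and 4 can be applied mechanically to a poll's published topline numbers. The response counts below are invented for illustration; a large midpoint share combined with a small "strongly" share is the warning sign:

```python
# Hypothetical topline counts for a five-point favor/oppose question.
responses = {
    "strongly favor": 60,
    "favor, not strongly": 240,
    "neutral": 400,
    "oppose, not strongly": 230,
    "strongly oppose": 70,
}

total = sum(responses.values())
midpoint_share = responses["neutral"] / total
strong_share = (responses["strongly favor"] + responses["strongly oppose"]) / total

# 40% at the midpoint and only 13% feeling strongly suggests many non-attitudes.
print(f"midpoint: {midpoint_share:.0%}, strong: {strong_share:.0%}")
```

There is no fixed threshold for either share; the checklist's point is that both numbers together, read against how much news coverage the issue gets, signal how seriously to take the topline result.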
5. Does the survey influence answers by giving respondents information about the issue? Some survey questions start by providing respondents with background information about the topic before asking them how they feel about it. Even when the information is balanced, such as presenting what both Democrats and Republicans have to say about it, giving respondents information risks informing them about the issue when they were otherwise ignorant about it.
Providing information in the question wording thus makes it easier for respondents who hadn’t heard that information to answer the question. Likewise, telling respondents that a prominent political figure supports or opposes a policy helps respondents form an answer on the spot, because they are responding based on how they feel about that person rather than revealing a deeply held opinion.
Note: I’d like to sincerely thank my students in PS 411 for helping me to refine my thoughts about non-attitudes and craft this list.