Do people lie to pollsters?
Posted May 29, 2014
Raleigh, N.C. — Sometimes people answer survey questions dishonestly. One reason is that respondents hold “socially undesirable” opinions about a topic.
A white person who opposes interracial marriage, for example, probably feels social pressure to express the opposite viewpoint or risk being called a racist. In short, social pressure can cause survey respondents to give an answer that doesn’t match their true feelings.
Not only do people under-report socially undesirable beliefs, but they also over-report behaviors that are socially desirable. As the Pew Research Center explains, “research has shown that respondents understate alcohol and drug use, tax evasion and racial bias; they may also overstate church attendance, charitable contributions, and the likelihood that they will vote in an election.”
A recent survey conducted on behalf of the Public Religion Research Institute (PRRI) supports this speculation regarding church attendance.
Their study, though, is worth reviewing because of how it was conducted. PRRI asked “identical questions about religious attendance, affiliation, salience and belief in God on two surveys – one via telephone and the other online – and compared the results.” According to PRRI, Americans report being more religious in the telephone sample compared with the online sample. More specifically, 36 percent of Americans reported attending religious services “weekly or more” over the telephone, compared with 31 percent on the online survey. Likewise, 30 percent of telephone respondents and 43 percent of online survey respondents said they attend religious services “seldom” or “never.”
Why would higher reported church attendance in the telephone sample mean that those respondents were less truthful? The assumption is that honesty in surveys increases as respondents’ privacy is maximized. Beyond assuring a respondent anonymity, how a survey is administered is the key factor. A respondent who is interviewed in person, with the questions and answers spoken out loud, has minimal privacy. Conversely, a respondent who can read and answer survey questions alone on a computer has maximum privacy.
The potential problem with this study is that pollsters can record different results for questions unaffected by social desirability simply because the survey was administered differently. The primary reason is selection bias. Research finds that different kinds of people are easier or harder to reach on the phone versus over the Internet. If personal characteristics affecting response rates are correlated with opinions, we can expect to find different results independent of any social desirability effect.
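To see how selection bias alone can produce a gap between survey modes, consider a toy simulation. Every number in it is hypothetical and purely illustrative, not drawn from the PRRI study: it assumes a trait (here labeled “older”) that raises both church attendance and reachability by phone, while everyone answers honestly.

```python
import random

random.seed(0)

# Hypothetical population: each person has a true attendance status and
# a trait ("older") that affects both attendance and reachability.
# All parameters below are invented for illustration.
N = 100_000
population = []
for _ in range(N):
    older = random.random() < 0.4                      # 40% are "older"
    attends = random.random() < (0.45 if older else 0.25)
    population.append((older, attends))

def sample_mode(p_reach_older, p_reach_younger, n=2000):
    """Draw a sample where reachability depends on the same trait that
    predicts attendance, then return the reported attendance rate."""
    sample = []
    while len(sample) < n:
        older, attends = random.choice(population)
        p = p_reach_older if older else p_reach_younger
        if random.random() < p:
            sample.append(attends)
    return sum(sample) / len(sample)

# Phone over-reaches older people; online over-reaches younger people.
phone_rate = sample_mode(p_reach_older=0.8, p_reach_younger=0.3)
online_rate = sample_mode(p_reach_older=0.3, p_reach_younger=0.8)
print(f"phone: {phone_rate:.1%}, online: {online_rate:.1%}")
```

Everyone in this toy model reports truthfully, yet the phone sample shows noticeably higher attendance than the online sample, because the two modes reach different kinds of people. That is the alternative explanation a study like PRRI’s has to rule out.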
As it turns out, this study is fairly persuasive to me. The online survey was administered by GfK. This company performs surveys for academics like me, and it has recruited a massive panel (hundreds of thousands) of people willing to take surveys in exchange for free Internet access. Importantly, the panel was recruited just like a random telephone poll sample is generated: GfK randomly called Americans and recruited them to serve as a panel for future surveys. For each survey, respondents are then drawn at random from the panel. In other words, GfK’s recruitment strategy largely overcomes the selection-bias problems that plague “mixed mode” surveys (i.e., ones that use multiple ways of reaching respondents).
Ideally, all respondents in this study would have been recruited at the same time, via the same method. After that, half of the sample would be randomly assigned to complete the survey via the Internet while the other half completed it over the phone. That kind of randomization would have eliminated the possibility that selection bias, rather than social desirability, was responsible for the differences in reported church attendance.
Nevertheless, this study offers a great example of how social desirability can distort what surveys tell us about how people really think and behave.