
Mode Matters: Evaluating Response Comparability in a Mixed-Mode Survey*

Published online by Cambridge University Press: 02 July 2015

Abstract

This paper examines the effects of survey mode on patterns of survey response, paying special attention to the conditions under which mode effects are more or less consequential. We use the Youth Participatory Politics survey, a study administered either online or over the phone to 2,920 young people. Our results provide consistent evidence of mode effects. The internet sample exhibits higher rates of item non-response and “no opinion” responses, and considerably lower levels of differentiation in the use of rating scales. These differences remain even after accounting for how respondents selected into the mode of survey administration. We demonstrate the substantive implications of mode effects in the context of items measuring political knowledge and racial attitudes. We conclude by discussing the implications of our results for comparing data obtained from surveys conducted with different modes, and for the design and analysis of multi-mode surveys.

Type: Original Articles

Copyright: © The European Political Science Association 2015


Footnotes

* Benjamin T. Bowyer is a Senior Researcher in the Civic Engagement Research Group, School of Education, Mills College, 5000 MacArthur Boulevard, MB-56, Oakland, CA 94613 (bbowyer@mills.edu). Jon C. Rogowski is an Assistant Professor in the Department of Political Science, Washington University, Campus Box 1063, One Brookings Drive, St. Louis, MO 63130 (jrogowski@wustl.edu). The data used in this project were collected as part of the Youth Participatory Politics Study funded by the MacArthur Foundation under the supervision of Cathy Cohen and Joseph Kahne, principal investigators. The authors are grateful to Matthew DeBell, Chris Evans, Ellen Middaugh, and Catherine de Vries for thoughtful discussion and helpful comments. To view supplementary material for this article, please visit http://dx.doi.org/10.1017/psrm.2015.28

Supplementary material: Bowyer et al. datasets (link); Bowyer and Rogowski supplementary material, Supplementary Appendix (PDF, 106 KB).