The Effect of Survey Mode on Data Quality: Disentangling Nonresponse and Measurement Error Bias

Abstract

More and more surveys are conducted online. While web surveys are generally cheaper and tend to have lower measurement error than other survey modes, especially for sensitive questions, these potential advantages might be offset by larger nonresponse bias. This article compares data quality in a web survey to that in another common mode of survey administration, the telephone.

The unique feature of this study is the availability of administrative records for all sampled individuals, combined with random assignment of survey mode. This design allows us to investigate and compare potential bias in survey statistics due to (1) nonresponse error, (2) measurement error, and (3) the combination of these two error sources, and thus to assess overall data quality for two common modes of survey administration, telephone and web.
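To make the decomposition concrete, the following is a minimal sketch in Python, assuming a continuous outcome for which an administrative benchmark is known for every sampled case. The function and variable names are hypothetical; this illustrates the standard additive decomposition, not the authors' exact estimator.

```python
import numpy as np

def bias_decomposition(admin_full, responded, survey_reports):
    """Split the total bias of a respondent mean into nonresponse and
    measurement components, using admin records known for ALL sampled
    cases (hypothetical names; a sketch, not the authors' estimator).

    admin_full     : admin-record values for every sampled case
    responded      : boolean mask, True where the case responded
    survey_reports : survey answers, aligned with admin_full[responded]
    """
    admin_full = np.asarray(admin_full, dtype=float)
    responded = np.asarray(responded, dtype=bool)
    survey_reports = np.asarray(survey_reports, dtype=float)

    truth_all = admin_full.mean()              # benchmark: full-sample admin mean
    truth_resp = admin_full[responded].mean()  # admin mean among respondents only
    report_resp = survey_reports.mean()        # survey-report mean among respondents

    nonresponse_bias = truth_resp - truth_all    # respondents differ from the sample
    measurement_bias = report_resp - truth_resp  # respondents misreport vs. records
    total_bias = report_resp - truth_all         # identity: sum of the two components
    return nonresponse_bias, measurement_bias, total_bias
```

Running this separately within each randomly assigned mode yields mode-specific nonresponse, measurement, and total bias estimates that can then be compared directly between telephone and web.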

Our results show that, overall, mean estimates are more biased in the web mode than in the telephone mode. Nonresponse and measurement bias tend to reinforce each other in both modes, with nonresponse bias somewhat more pronounced in the web mode. While measurement error bias tends to be smaller in the web implementation, interestingly, our results also show that the web does not consistently outperform the telephone mode for sensitive questions.
