Dynamic Question Ordering in Online Surveys

Abstract

Online surveys have the potential to support adaptive questions, where later questions depend on earlier responses. Past work has taken a rule-based approach that is applied uniformly across all respondents. We envision a richer interpretation of adaptive questions, which we call Dynamic Question Ordering (DQO), in which the question order is personalized to each respondent. Such an approach could increase engagement, and therefore response rates, as well as imputation quality. We present a DQO framework for improving survey completion and imputation. In the general survey-taking setting, where the goal is to maximize survey completion, we order questions to engage the respondent and to collect, ideally, all of the information, or at least the information that best characterizes the respondent, so that any remaining items can be imputed accurately. In a second setting, the goal is to provide a personalized prediction. Because reasonable predictions can be made from only a subset of the questions, we are less concerned with motivating respondents to answer every question; instead, we order questions to elicit the information that most reduces prediction uncertainty while keeping respondent burden low. We illustrate the framework with two case studies, one for the prediction setting and one for the survey-taking setting. We also discuss DQO for national surveys and consider connections between our statistics-based question-ordering approach and cognitive survey methodology.
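The abstract does not prescribe a particular model for the prediction setting, but the following Python sketch illustrates, under a simplifying assumption, what one greedy DQO step could look like: it assumes the questions and the prediction target are jointly Gaussian, and it selects the unanswered question with the largest expected reduction in prediction variance per unit of respondent burden. The covariance matrix, the cost vector, and the helper names (`conditional_variance`, `next_question`) are hypothetical illustrations, not part of the paper's own implementation.

```python
import numpy as np

def conditional_variance(cov, target, observed):
    """Variance of variable `target` given the variables in `observed`,
    under a joint Gaussian model with covariance matrix `cov`."""
    if not observed:
        return cov[target, target]
    obs = sorted(observed)
    S_oo = cov[np.ix_(obs, obs)]   # covariance among observed questions
    s_to = cov[target, obs]        # covariance between target and observed questions
    return cov[target, target] - s_to @ np.linalg.solve(S_oo, s_to)

def next_question(cov, target, answered, costs):
    """One greedy DQO step: choose the unanswered question that yields the
    largest reduction in prediction variance per unit of respondent burden."""
    candidates = [q for q in range(cov.shape[0])
                  if q != target and q not in answered]
    current = conditional_variance(cov, target, answered)
    def value(q):
        return (current - conditional_variance(cov, target, answered | {q})) / costs[q]
    return max(candidates, key=value)

# Hypothetical example: three questions (indices 0-2) and a prediction target (index 3).
cov = np.array([[1.0, 0.3, 0.1, 0.8],
                [0.3, 1.0, 0.2, 0.4],
                [0.1, 0.2, 1.0, 0.1],
                [0.8, 0.4, 0.1, 1.0]])
costs = {0: 1.0, 1: 1.0, 2: 0.5}   # relative burden of asking each question
print(next_question(cov, target=3, answered=set(), costs=costs))  # -> 0
```

In a full interview, this step would be repeated after each response, stopping once a burden budget is exhausted or the prediction uncertainty falls below a chosen threshold; the Gaussian assumption is only for illustration, and any model that quantifies prediction uncertainty could play the same role.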

