Online surveys have the potential to support adaptive questions, where later questions depend on earlier responses. Past work has taken a rule-based approach, applying the same branching rules uniformly across all respondents. We envision a richer interpretation of adaptive questions, which we call Dynamic Question Ordering (DQO), in which the question order itself is personalized. Such an approach could increase engagement, and therefore response rate, as well as imputation quality. We present a DQO framework to improve survey completion and imputation. In the general survey-taking setting, our goal is to maximize survey completion, so we order questions to engage the respondent and to collect all information if possible, or at least the information that best characterizes the respondent, so that the remainder can be imputed accurately. In a second scenario, our goal is to provide a personalized prediction. Since reasonable predictions can be made from only a subset of questions, we are not concerned with motivating users to answer every question; instead, we order questions to obtain the information that most reduces prediction uncertainty, while keeping respondent burden low. We illustrate this framework with two case studies, one for each setting. We also discuss DQO for national surveys and consider connections between our statistics-based question-ordering approach and cognitive survey methodology.
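The prediction-setting idea, choosing the next question that most reduces prediction uncertainty relative to the burden it imposes, can be sketched as a simple greedy loop. The uncertainty-reduction estimates, burden costs, and stopping threshold below are hypothetical placeholders for illustration, not quantities from the framework itself.

```python
def next_question(candidates):
    """Pick the question with the best expected uncertainty reduction
    per unit of respondent burden.

    candidates: dict mapping question id -> (expected_reduction, burden_cost)
    """
    return max(candidates, key=lambda q: candidates[q][0] / candidates[q][1])

def order_questions(candidates, stop_threshold=0.05):
    """Greedily order questions, stopping once the best remaining
    benefit-per-cost ratio falls below stop_threshold (i.e., further
    questions are not worth the added burden)."""
    remaining = dict(candidates)
    order = []
    while remaining:
        q = next_question(remaining)
        gain, cost = remaining.pop(q)
        if gain / cost < stop_threshold:
            break
        order.append(q)
    return order

# Toy example: (uncertainty reduction, burden) pairs for three questions.
qs = {"income": (0.40, 2.0), "age": (0.25, 1.0), "hobbies": (0.02, 3.0)}
print(order_questions(qs))  # age first (ratio 0.25), then income; hobbies skipped
```

In practice the expected reductions would come from the predictive model itself (e.g., by estimating posterior variance with and without each candidate answer), and could be re-estimated after every response rather than fixed up front.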