Combining Multiple Methods in Establishment Questionnaire Testing: The 2017 Census of Agriculture Testing Bento Box


Abstract

There are many methods that can be used to test questionnaires, each with its own strengths and weaknesses. The best approaches to questionnaire testing combine different methods to both broaden and strengthen the results. The US Census of Agriculture (COA) is conducted every five years and collects detailed information on agricultural production, inventories, practices, and operator demographics from agricultural establishments. Before each COA, evaluation and testing are conducted to assess new questionnaire items and improve data quality for the subsequent COA. This article describes how a multi-method approach, which we call Bento Box Testing, was applied to establishment questionnaire testing leading up to the 2017 COA. Testing included solicitation of expert opinion, historical data review, cognitive testing, a large-scale field test, and qualitative follow-up interviews. We discuss the benefits of these testing methods, considerations for establishment survey testing, and how their results in combination provide a stronger evaluation.


Journal of Official Statistics, published by Statistics Sweden
