Does the Length of Fielding Period Matter? Examining Response Scores of Early Versus Late Responders



This article examines the potential effects of a shortened fielding period on an employee survey’s item and index scores and on respondent demographics. Using data from the U.S. Office of Personnel Management’s 2011 Federal Employee Viewpoint Survey, we investigate whether employees who respond early differ from those who respond late, focusing on item and index scores related to employee engagement and global satisfaction. Our findings show that early responders tend to be less positive, even after their weights are adjusted for nonresponse. Agencies vary in their prevalence of late responders, and score differences become magnified as this proportion increases. We also examine the extent to which early and late responders differ on demographic characteristics such as grade level, supervisory status, gender, tenure with agency, and intention to leave, finding that nonminority status and female gender are the two characteristics most strongly associated with responding early.
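The core comparison described above can be sketched in a few lines. The following is a minimal illustration only, with simulated data and hypothetical column meanings (the actual FEVS analysis uses the survey's own items and nonresponse-adjusted weights): it computes weighted mean item scores separately for early and late responders.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1000

# Hypothetical response-timing flag: 1 = early responder, 0 = late responder.
early = rng.integers(0, 2, size=n)

# Simulated 5-point Likert item scores; late responders are made slightly
# more positive, mirroring the direction of the finding in the abstract.
score = np.clip(rng.normal(3.5 + 0.2 * (1 - early), 1.0, size=n), 1, 5)

# Stand-in nonresponse-adjustment weights (random here, purely illustrative).
weight = rng.uniform(0.5, 2.0, size=n)

def weighted_mean(x, w):
    """Weighted mean: sum of w*x divided by sum of w."""
    return np.sum(w * x) / np.sum(w)

early_mean = weighted_mean(score[early == 1], weight[early == 1])
late_mean = weighted_mean(score[early == 0], weight[early == 0])
print(f"early: {early_mean:.2f}, late: {late_mean:.2f}, "
      f"gap: {late_mean - early_mean:.2f}")
```

In practice, the gap would be computed per item and per index within each agency, so that the agency-level share of late responders can be related to the size of the difference.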


  • American Association for Public Opinion Research. 2009. Standard Definitions: Final Dispositions of Case Codes and Outcome Rates for Surveys, 6th ed. Available at: (accessed January 27, 2014).

  • Baruch, Y., and B.C. Holtom. 2008. “Survey Response Rate Levels and Trends in Organizational Research.” Human Relations 61: 1139-1160.

  • Bates, N., and K. Creighton. 2000. “The Last Five Percent: What Can We Learn from Difficult/Late Interviews?” In Proceedings of the Section on Government Statistics, American Statistical Association, August 13, 2000, 120-125. Alexandria, VA: American Statistical Association.

  • Baur, E.J. 1947. “Response Bias in a Mail Survey.” Public Opinion Quarterly 11: 595-600.

  • Borg, I., and T. Tuten. 2003. “Early versus Later Respondents in Intranet-Based Organizational Surveys.” Journal of Behavioral and Applied Management 4: 134-145.

  • Carroll, R.J., and D. Ruppert. 1988. Transformation and Weighting in Regression. New York, NY: Chapman and Hall.

  • De Leeuw, E., and W. de Heer. 2002. “Trends in Household Survey Nonresponse: A Longitudinal and International Comparison.” In Survey Nonresponse, edited by Robert M. Groves et al. New York, NY: Wiley.

  • Dillman, D.A., J.D. Smyth, and L.M. Christian. 2009. Internet, Mail, and Mixed-Mode Surveys: The Tailored Design Method, 3rd ed. Hoboken, NJ: Wiley.

  • Ellis, R.A., C.M. Endo, and J.M. Armer. 1970. “The Use of Potential Nonrespondents for Studying Nonresponse Bias.” Pacific Sociological Review 13: 103-109.

  • Erickson, T.J. 2005. “The 21st Century Workplace: Preparing for Tomorrow’s Employment Trends Today.” Testimony submitted before the U.S. Senate Committee on Health, Education, Labor, and Pensions, May 26, 2005. Available at: (accessed December 2012).

  • Filion, F. 1975. “Estimating Bias Due to Nonresponse in a Mail Survey.” Public Opinion Quarterly 39: 482-492.

  • Gannon, M.J., J.C. Nothern, and S.J. Carroll. 1971. “Characteristics of Nonrespondents Among Workers.” Journal of Applied Psychology 55: 586-588.

  • Green, K.E. 1991. “Reluctant Respondents: Differences Between Early, Late, and Nonresponders to a Mail Survey.” The Journal of Experimental Education 59: 268-276.

  • Groves, R.M., and M. Couper. 1998. Nonresponse in Household Interview Surveys. New York, NY: Wiley.

  • Groves, R.M., E. Singer, and A. Corning. 2000. “Leverage-Saliency Theory of Survey Participation: Description and an Illustration.” Public Opinion Quarterly 64: 299-308.

  • Jacoby, J., and M.S. Matell. 1971. “Three-Point Likert Scales are Good Enough.” Journal of Marketing Research 8: 495-500.

  • Kalton, G., and I. Flores-Cervantes. 2003. “Weighting Methods.” Journal of Official Statistics 19: 81-97.

  • Kraut, A.I. 1996. Organizational Surveys: Tools for Assessment and Change. San Francisco, CA: Jossey-Bass.

  • Kreuter, F. 2013. Improving Surveys with Paradata: Analytic Uses of Process Information. Hoboken, NJ: Wiley.

  • Little, R.J.A., and D.B. Rubin. 2002. Statistical Analysis with Missing Data, 2nd ed. New York, NY: Wiley.

  • Macey, W.H., and B. Schneider. 2008. “The Meaning of Employee Engagement.” Industrial and Organizational Psychology 1: 3-30.

  • Mayer, C.S., and R.W. Pratt, Jr. 1966. “A Note on Nonresponse in a Mail Survey.” Public Opinion Quarterly 30: 637-646.

  • Newman, S.W. 1962. “Differences between Early and Late Respondents to a Mailed Survey.” Advertising Research 2: 37-39.

  • Pace, R.C. 1939. “Factors Influencing Questionnaire Returns from Former University Students.” Journal of Applied Psychology 23: 388-397.

  • Rogelberg, S.G., and J.M. Stanton. 2007. “Understanding and Dealing with Organizational Survey Nonresponse.” Organizational Research Methods 10: 195-209.

  • Schwirian, K.P., and H.R. Blaine. 1966. “Questionnaire-Return Bias in the Study of Blue-Collar Workers.” Public Opinion Quarterly 30: 656-663.

  • Sonquist, J.A., E.L. Baker, and J.N. Morgan. 1974. Searching for Structure. Ann Arbor, MI: Institute for Social Research, University of Michigan.

  • U.S. Office of Personnel Management. 2011. 2011 Federal Employee Viewpoint Survey: Governmentwide Management Report. Washington, DC: OPM. Available at: http:// (accessed January 27, 2014).

  • Wagner, J. 2013. “Adaptive Contact Strategies in Telephone and Face-to-Face Surveys.” Survey Research Methods 7: 45-55.

  • Weeks, M.F. 1987. “Optimal Call Scheduling for a Telephone Survey.” Public Opinion Quarterly 51: 540-549.
