Search Results

You are looking at 1 - 10 of 38 items for:

  • Probability and Statistics
Open access

Jörg Drechsler, Gerd Ronning and Philipp Bleninger

… (1967). A Comparison of Four Methods for Constructing Factor Scores. Psychometrika, 32, 381-401. DOI: http://dx.doi.org/10.1007/BF02289653.
O’Keefe, C., Sparks, R., McAullay, D., and Loong, B. (2012). Confidentialising Survival Analysis Output in a Remote Data Access System. Journal of Privacy and Confidentiality, 4. Available at: http://repository.cmu.edu/jpc/vol4/iss1/6 (accessed January 17, 2014).
O’Keefe, C.M. and Good, N.M. (2008). A Remote Analysis Server - What Does Regression Output Look Like? In Privacy in Statistical …

Open access

Marco Di Zio and Ugo Guarnera

…, and Scholtus, S. (2011). Handbook of Statistical Data Editing and Imputation. New York: John Wiley and Sons.
Ghosh-Dastidar, B. and Schafer, J.L. (2006). Outlier Detection and Editing Procedures for Continuous Multivariate Data. Journal of Official Statistics, 22, 487-506.
Granquist, L. (1997). The New View on Editing. International Statistical Review, 65, 381-387.
Hedlin, D. (2003). Score Functions to Reduce Business Survey Editing at the U.K. Office for National Statistics. Journal of Official Statistics, 19, 177-199. …

Open access

Anders Norberg

… Editing.” Statistical Review 2: 105-118.
Granquist, L. 1997. “The New View on Editing.” International Statistical Review 65(3): 381-387. Doi: http://dx.doi.org/10.2307/1403378.
Granquist, L. and J. Kovar. 1997. “Editing of Survey Data: How Much Is Enough?” In Survey Measurement and Process Quality, 415-435. Doi: http://dx.doi.org/10.1002/9781118490013.ch18.
Hedlin, D. 2003. “Score Functions to Reduce Business Survey Editing at the UK Office for National Statistics.” Journal of Official Statistics 19: 177-199. …

Open access

Ton de Waal

… Three Eras of Survey Research. Public Opinion Quarterly, 75, 861-871. DOI: http://dx.doi.org/10.1093/poq/nfr057.
Hedlin, D. (2003). Score Functions to Reduce Business Survey Editing at the U.K. Office for National Statistics. Journal of Official Statistics, 19, 177-199.
Hedlin, D. (2008). Local and Global Score Functions in Selective Editing. UN/ECE Work Session on Statistical Data Editing, 21-23 April, Vienna.
Hidiroglou, M.A. and Berthelot, J.-M. (1986). Statistical Editing and Imputation for Periodic Business …

Open access

Richard Sigman, Taylor Lewis, Naomi Dyer Yount and Kimya Lee

Abstract

This article discusses the potential effects of a shortened fielding period on an employee survey’s item and index scores and respondent demographics. Using data from the U.S. Office of Personnel Management’s 2011 Federal Employee Viewpoint Survey, we investigate whether early-responding employees differ from later-responding employees. Specifically, we examine differences in item and index scores related to employee engagement and global satisfaction. Our findings show that early responders tend to be less positive, even after their weights are adjusted for nonresponse. Agencies vary in their prevalence of late responders, and score differences become magnified as this proportion increases. We also examine the extent to which early and late responders differ on demographic characteristics such as grade level, supervisory status, gender, tenure with agency, and intention to leave, finding that nonminority status and female gender are the two demographic characteristics most associated with responding early.
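The early-versus-late comparison described above can be illustrated with a short sketch. The data frame and column names (resp_day, engagement, weight) are hypothetical, with weight standing in for the nonresponse-adjusted weight mentioned in the abstract.

```python
import numpy as np
import pandas as pd

def early_late_gap(df, cutoff_day, score_col="engagement", weight_col="weight"):
    """Weighted mean score gap between early and late responders.

    Rows are respondents; `resp_day` is the day the response arrived.
    All column names are illustrative, not taken from the article.
    """
    early = df["resp_day"] <= cutoff_day
    def wmean(g):
        return np.average(g[score_col], weights=g[weight_col])
    # A negative gap reproduces the pattern reported above:
    # early responders tend to be less positive.
    return wmean(df[early]) - wmean(df[~early])
```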

Open access

Bin Liu, Cindy Long Yu, Michael Joseph Price and Yan Jiang

7. References
Ashmead, R. 2014. “Propensity Score Methods for Estimating Causal Effects from Complex Survey Data.” Ph.D. Dissertation, Ohio State University. Retrieved from http://rave.ohiolink.edu/etdc/view?acc_num=osu1417616653.
Berg, E., J.K. Kim, and C. Skinner. 2016. “Imputation under Informative Sampling.” Journal of Survey Statistics and Methodology 4: 436–462. Doi: 10.1093/jssam/smw032.
Breidt, F.J., G. Claeskens, and J.D. Opsomer. 2005. “Model-Assisted Estimation for Complex Surveys Using Penalised Splines.” Biometrika 92(4): 831 …

Open access

David Haziza and Éric Lesage

… Protection for Unit Nonresponse With a Nonlinear Calibration-Weighting Routine.” Survey Research Methods 6: 105-111.
Lee, S. 2006. “Propensity Score Adjustments as a Weighting Scheme for Volunteer Panel Web Surveys.” Journal of Official Statistics 22: 329-349.
Little, R.J.A. 1986. “Survey Nonresponse Adjustments for Estimates of Means.” International Statistical Review 54: 139-157.
Little, R.J.A. and S. Vartivarian. 2005. “Does Weighting for Nonresponse Increase the Variance of Survey Means?” Survey Methodology 31: 161 …

Open access

Morgan Earp, Melissa Mitchell, Jaki McCarthy and Frauke Kreuter

… Doi: http://dx.doi.org/10.1214/11-AOAS521.
Powers, R., J. Eltinge, and M. Cho. 2006. “Evaluation of the Detectability and Inferential Impact of Nonresponse Bias in Establishment Surveys.” In Proceedings of the Joint Statistical Meetings, American Statistical Association. Alexandria, VA: American Statistical Association. Available at: http://www.bls.gov/ore/pdf/st060130.pdf (accessed August 2014).
Rosenbaum, P. and D. Rubin. 1983. “The Central Role of the Propensity Score in Observational Studies for Causal Effects.” Biometrika 70: 41-55. DOI: …

Open access

Aida Calviño

Abstract

In this article we propose a simple and versatile method for limiting disclosure in continuous microdata, based on Principal Component Analysis (PCA). Instead of perturbing the original variables, we propose to alter the principal components: they carry the same information but are uncorrelated, which permits working on each component separately and reduces processing times. The number and weight of the perturbed components determine the level of protection and the distortion of the masked data. The method preserves the mean vector and the variance-covariance matrix. Furthermore, depending on the technique chosen to perturb the principal components, the proposed method can produce masked, hybrid, or fully synthetic data sets. Examples of application and comparisons with other methods previously proposed in the literature (in terms of disclosure risk and data utility) are also included.
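As a rough illustration of the idea (a minimal sketch, not the author’s implementation), the snippet below perturbs the lowest-variance principal components with additive noise and rescales each perturbed component back to its original mean and variance. Means and variances are thereby preserved exactly; cross-component correlations are only approximately zero in a finite sample, so exact covariance preservation would need an extra orthogonalization step. The function name and parameters are invented for the example.

```python
import numpy as np

def pca_mask(X, n_perturb=2, noise_scale=0.5, seed=None):
    """Sketch of PCA-based masking: perturb selected component scores,
    then restore each component's original mean and variance."""
    rng = np.random.default_rng(seed)
    mu = X.mean(axis=0)
    Xc = X - mu
    # Principal components from the sample covariance matrix;
    # eigh returns eigenvalues in ascending order.
    eigval, eigvec = np.linalg.eigh(np.cov(Xc, rowvar=False))
    scores = Xc @ eigvec  # uncorrelated component scores
    for j in range(n_perturb):  # lowest-variance components first
        s = scores[:, j]
        noisy = s + rng.normal(0.0, noise_scale * s.std(ddof=1), size=s.shape)
        # Rescale so the perturbed component keeps its original
        # mean and variance (preserving the covariance matrix
        # up to finite-sample cross-correlations).
        scores[:, j] = (noisy - noisy.mean()) / noisy.std(ddof=1) \
            * s.std(ddof=1) + s.mean()
    return scores @ eigvec.T + mu  # map back to the original variables
```

Swapping the noise-addition step for swapping or for resampling the component scores from a fitted model would yield the masked, hybrid, or fully synthetic variants mentioned in the abstract.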

Open access

Alexis Dewaele, Maya Caen and Ann Buysse

… , Bisexual and Transgender Populations, I.H. Meyer and M.E. Northridge (eds). New York: Springer, 441-454.
Schillewaert, N. and Meulemeester, P. (2005). Comparing Response Distributions of Offline and Online Data Collection Methods. International Journal of Market Research, 47, 163-178.
Schonlau, M., van Soest, A., Kapteyn, A., and Couper, M. (2009). Selection Bias in Web-Surveys and the Use of Propensity Scores. Sociological Methods and Research, 37, 291-318.
Schwarcz, S., Spindler, H., Scheer, S., Valleroy, L., and Lansky …