Search Results

1–10 of 583 items (filter: Mathematics)

Enhancing Survey Quality: Continuous Data Processing Systems

Abstract

Producers of large government-sponsored surveys regularly use Computer-Assisted Interviewing (CAI) software to design data collection instruments, monitor fieldwork operations, and evaluate data quality. When this software is used in conjunction with responsive survey designs, problems in the field can be addressed quickly through last-minute modifications. Complementing this strategy, but little discussed, is the need to implement similar changes in the post-data-collection stage of the survey data life cycle. We describe a continuous data processing system in which completed interviews are carefully examined as soon as they are collected; editing, recode, and imputation programs are applied using CAI tools; and the results are reviewed to correct problematic cases. The goal is to provide higher quality data and to shorten the time between the conclusion of data collection and the release of public-use data files.
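
A minimal sketch of how such a continuous, per-interview processing loop might be organized (all function names, fields, and checks below are hypothetical illustrations, not the CAI tooling described in the article):

    # Hypothetical sketch of a continuous post-collection processing loop:
    # each completed interview is edited, recoded, and imputed as soon as it
    # arrives, and flagged cases are queued for interactive review.

    def edit_checks(record):
        """Return a list of edit failures (empty list means the record is clean)."""
        failures = []
        if record.get("age") is not None and not (0 <= record["age"] <= 120):
            failures.append("age out of range")
        if record.get("income") is not None and record["income"] < 0:
            failures.append("negative income")
        return failures

    def recode(record):
        """Derive analysis variables from raw responses."""
        record["age_group"] = None if record.get("age") is None else record["age"] // 10
        return record

    def impute(record, donor_pool):
        """Very naive hot-deck imputation: copy a missing item from a clean donor."""
        if record.get("income") is None and donor_pool:
            record["income"] = donor_pool[-1]["income"]
            record["income_imputed"] = True
        return record

    def process_interview(record, donor_pool, review_queue):
        problems = edit_checks(record)
        if problems:
            review_queue.append((record["case_id"], problems))
        record = recode(record)
        record = impute(record, donor_pool)
        if not problems:
            donor_pool.append(record)
        return record

    # Example: two interviews processed as soon as they are completed.
    donors, review = [], []
    for raw in [{"case_id": 1, "age": 34, "income": 52000},
                {"case_id": 2, "age": 150, "income": None}]:
        process_interview(raw, donors, review)
    print(review)   # cases queued for interactive correction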

Open access
Accounting for Complex Sampling in Survey Estimation: A Review of Current Software Tools

Methods for Complex Sample Data: Logistic Regression and Discrete Proportional Hazards Models.” Communications in Statistics-Theory and Methods 14: 1377–1392. Doi: https://doi.org/10.1080/03610928508828982. Chantala, K., D. Blanchette, and C.M. Suchindran. 2011. “Software to Compute Sampling Weights for Multilevel Analysis.” Technical Report, Carolina Population Center, UNC at Chapel Hill. Available at http://www.cpc.unc.edu/research/tools/data_analysis/ml_sampling_weights (accessed January 30, 2018). Claeskens, G. 2013. “Lack of Fit, Graphics, and

Open access
SELEKT – A Generic Tool for Selective Editing

References Adolfsson, C. and P. Gidlund. 2008. “Conducted Case Studies at Statistics Sweden.” Paper presented at the Work Session on Statistical Data Editing, Vienna, Austria, 21-23 April 2008. Available at: http://www.unece.org/fileadmin/DAM/stats/documents/2008/04/sde/wp.32.e.pdf (accessed February 2016). Brinkley, E., K. Farwell, and F. Yu. 2011. “Selective Editing Methods and Tools: An Australian Bureau of Statistics Perspective.” In Proceedings of Statistics Canada Symposium 2011. Available at: http

Open access
Data mining as a tool in privacy-preserving data publishing

Abstract

Many databases contain data about individuals that are valuable for research, marketing, and decision making. Sharing or publishing data about individuals is, however, prone to privacy attacks, breaches, and disclosures. The concern here is individuals' privacy: keeping sensitive information about individuals private to them. Data mining in this setting has been shown to be a powerful tool to breach privacy and make disclosures. In contrast, data mining can also be used in practice to aid data owners in deciding how to share and publish their databases. We present and discuss the role and uses of data mining in these scenarios and also briefly discuss other approaches to private data analysis.

Open access
Analytic Tools for Evaluating Variability of Standard Errors in Large-Scale Establishment Surveys

Abstract

Large-scale establishment surveys often exhibit substantial temporal or cross-sectional variability in their published standard errors. This article uses a framework defined by survey generalized variance functions to develop three sets of analytic tools for the evaluation of these patterns of variability. These tools are for (1) identification of predictor variables that explain some of the observed temporal and cross-sectional variability in published standard errors; (2) evaluation of the proportion of variability attributable to the abovementioned predictors, equation error and estimation error, respectively; and (3) comparison of equation error variances across groups defined by observable predictor variables. The primary ideas are motivated and illustrated by an application to the U.S. Current Employment Statistics program.
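
As a rough, self-contained illustration of the generalized variance function idea underlying these tools (the model form, synthetic data, and variable names below are assumptions for illustration only, not the CES specification):

    # Illustrative generalized variance function (GVF) fit: regress the log
    # relative variance of published estimates on the log of the estimate.
    # Residual scatter around the fitted line plays the role of the
    # "equation error" discussed in the abstract.
    import numpy as np

    rng = np.random.default_rng(0)
    estimates = rng.uniform(1e3, 1e6, size=200)            # synthetic published totals
    true_relvar = 0.5 / estimates                           # assume relvar ~ b / estimate
    relvar = true_relvar * np.exp(rng.normal(0, 0.2, 200))  # add equation error

    # Fit log(relvar) = alpha + beta * log(estimate) by ordinary least squares.
    X = np.column_stack([np.ones_like(estimates), np.log(estimates)])
    y = np.log(relvar)
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    residuals = y - X @ beta

    print("fitted GVF coefficients:", beta)                 # slope should be near -1
    print("equation-error variance (residual):", residuals.var(ddof=2))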

Open access
Loop profiling tool for HPC code inspection as an efficient method of FPGA based acceleration

This paper presents research on FPGA-based acceleration of HPC applications. The most important goal is to extract the code that can be sped up, and a major drawback is the lack of a tool that can do this automatically. HPC applications usually consist of a huge amount of complex source code, which is one reason why the acceleration process should be as automated as possible. Another reason is to make use of HLLs (High Level Languages) such as Mitrion-C (Mohl, 2006), which were invented to make the development of HPRC applications faster. Loop profiling is one of the steps used to check whether inserting HLL code into an existing HPC source can accelerate the application. Hence the most important step towards acceleration is to extract the most time-consuming code together with its data dependencies, which makes the code easier to pipeline and parallelize. Data dependency analysis also gives information on how to implement algorithms in an FPGA circuit with minimal initialization during the execution of the algorithms.
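
A conceptual sketch of the loop-profiling idea, which ranks candidate loop kernels by measured execution time so the hottest one can be singled out for acceleration (the paper instruments HPC C source; the Python harness and kernel names below are illustrative assumptions):

    # Conceptual sketch of loop profiling: time candidate loop kernels and
    # rank them so the most time-consuming one can be extracted for
    # acceleration on an FPGA.
    import time

    def kernel_a(n=200):
        s = 0.0
        for i in range(n):            # nested loop: likely hotspot
            for j in range(n):
                s += (i * j) % 7
        return s

    def kernel_b(n=200000):
        s = 0.0
        for i in range(n):            # single flat loop
            s += i * 0.5
        return s

    def time_one(fn):
        start = time.perf_counter()
        fn()
        return time.perf_counter() - start

    def profile(kernels, repeats=3):
        timings = {name: min(time_one(fn) for _ in range(repeats))
                   for name, fn in kernels.items()}
        return sorted(timings.items(), key=lambda kv: kv[1], reverse=True)

    for name, seconds in profile({"kernel_a": kernel_a, "kernel_b": kernel_b}):
        print(f"{name}: {seconds:.4f} s")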

Open access
Local Correlation and Entropy Maps as Tools for Detecting Defects in Industrial Images

The aim of this paper is to propose two methods of detecting defects in industrial products by an analysis of gray level images with low contrast between the defects and their background. An additional difficulty is the high nonuniformity of the background in different parts of the same image. The first method is based on correlating subimages with a nondefective reference subimage and searching for pixels with low correlation. To speed up calculations, correlations are replaced by a map of locally computed inner products. The second approach does not require a reference subimage and is based on estimating local entropies and searching for areas with maximum entropy. A nonparametric estimator of local entropy is also proposed, together with its realization as a bank of RBF neural networks. The performance of both methods is illustrated with an industrial image.
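
A small sketch of the local entropy map idea (window size, bin count, and the synthetic image are illustrative assumptions; the paper's nonparametric RBF-network estimator is not reproduced here):

    # Sketch of a local entropy map for defect detection: regions whose
    # gray-level distribution differs from the smooth background show up as
    # local entropy extremes.
    import numpy as np

    def local_entropy_map(image, window=7, bins=16):
        half = window // 2
        padded = np.pad(image, half, mode="reflect")
        out = np.zeros_like(image, dtype=float)
        for r in range(image.shape[0]):
            for c in range(image.shape[1]):
                patch = padded[r:r + window, c:c + window]
                hist, _ = np.histogram(patch, bins=bins, range=(0.0, 1.0))
                p = hist / hist.sum()
                p = p[p > 0]
                out[r, c] = -np.sum(p * np.log2(p))   # Shannon entropy of the patch
        return out

    # Synthetic low-contrast image: smooth background with a faint "defect".
    rng = np.random.default_rng(1)
    img = 0.5 + 0.02 * rng.standard_normal((64, 64))
    img[30:34, 30:34] += 0.05                          # subtle defect region

    emap = local_entropy_map(img)
    print("background entropy ~", emap[5, 5].round(2),
          "| defect entropy ~", emap[31, 31].round(2))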

Open access
Control Error Dynamic Modification as an Efficient Tool for Reduction of Effects Introduced by Actuator Constraints

A modification of digital controller algorithms, based on the introduction of a virtual reference value that never exceeds the active constraints on the actuator output, is presented and investigated for several algorithms used in single-loop control systems. This idea, derived from a virtual modification of the control error, can be used in digital control systems subject to both magnitude and rate constraints. The modification is introduced in the form of on-line adaptation to the control task; hence the design of optimal (in a specified sense) digital controller parameters can be separated from the actuator constraints. The adaptation of the control algorithm to actuator constraints is performed by a transformation of the control error and is equivalent to the introduction of a new, virtual reference value for the control system. The approach is illustrated with three digital control algorithms: the PID algorithm, the dead-beat controller and the state-space controller. In all cases, clearly improved transients are observed, which leads to some general conclusions on the problem of handling actuator constraints in control.
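
A rough sketch of the virtual-reference idea for a simple discrete PI loop with a saturated actuator (gains, plant model, and the back-calculation used here are illustrative assumptions; the article treats PID, dead-beat, and state-space controllers and may use a different transformation):

    # When the raw control would exceed the actuator limit, the control error
    # is scaled back so the controller effectively tracks a virtual reference
    # it can actually reach, and the integrator is updated with that modified
    # error instead of winding up.

    KP, KI = 2.0, 0.5           # PI gains (illustrative)
    U_MAX = 1.0                 # actuator magnitude constraint
    A, B = 0.9, 0.1             # first-order plant: y[k+1] = A*y[k] + B*u[k]

    y, integ = 0.0, 0.0
    reference = 5.0
    for k in range(60):
        error = reference - y
        u_raw = KP * error + KI * (integ + error)
        if abs(u_raw) > U_MAX:
            # Solve KP*e_mod + KI*(integ + e_mod) = u_sat for the modified error.
            u_sat = U_MAX if u_raw > 0 else -U_MAX
            e_mod = (u_sat - KI * integ) / (KP + KI)
            virtual_ref = y + e_mod    # reference the loop can actually follow
        else:
            e_mod, u_sat = error, u_raw
        integ += e_mod                  # integrator sees only the modified error
        y = A * y + B * u_sat           # plant update with the constrained input
    print(f"output after 60 steps: {y:.3f} (reference {reference})")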

Open access
A modified filter SQP method as a tool for optimal control of nonlinear systems with spatio-temporal dynamics

Our aim is to adapt Fletcher's filter approach to solve optimal control problems for systems described by nonlinear Partial Differential Equations (PDEs) with state constraints. To this end, we propose a number of modifications of the filter approach that are well suited to our purposes. We then discuss possible ways in which the filter method can cooperate with a PDE solver; one of these is selected and tested.
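
A minimal sketch of the filter mechanism that filter SQP methods build on: a trial point with objective f and constraint violation h is accepted only if it is sufficiently better than every filter entry in at least one of the two measures (the margins and acceptance rule below follow the standard Fletcher filter idea; the paper's modifications for PDE-constrained problems are not reproduced here):

    # A filter is a set of (objective, constraint-violation) pairs. A trial
    # point is accepted if no filter entry dominates it (with small margins);
    # accepted points remove the entries they themselves dominate.

    BETA, GAMMA = 0.99, 0.01   # standard envelope margins

    def acceptable(point, filter_entries):
        f, h = point
        return all(h <= BETA * hj or f <= fj - GAMMA * hj
                   for fj, hj in filter_entries)

    def add_to_filter(point, filter_entries):
        f, h = point
        # Drop entries dominated by the new point, then insert it.
        kept = [(fj, hj) for fj, hj in filter_entries if not (f <= fj and h <= hj)]
        kept.append(point)
        return kept

    flt = [(10.0, 0.5), (8.0, 1.0)]
    trial = (7.5, 0.4)
    if acceptable(trial, flt):
        flt = add_to_filter(trial, flt)
    print(flt)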

Open access
Univariate Tests for Phase Capacity: Tools for Identifying When to Modify a Survey’s Data Collection Protocol

. “Regression Analysis for Sample Survey.” Sankhyā 37, Series C, Pt. 3: 117–132. Graham, J., A. Olchowski, and T. Gilreath. 2007. “How Many Imputations Are Really Needed? Some Practical Clarifications of Multiple Imputation Theory.” Prevention Science 8: 206–213. Doi: http://dx.doi.org/10.1007/s11121-007-0070-9. Groves, R. and S. Heeringa. 2006. “Responsive Design for Household Surveys: Tools for Actively Controlling Survey Errors and Costs.” Journal of the Royal Statistical Society: Series A (Statistics in Society) 169: 439–457. Doi: http://dx.doi.org/10

Open access