This article reports on a study testing the effects of different ways of administering an opt-out consent for record linkage in a probability-based Internet panel. First, we conducted cognitive interviews to explore reactions to a draft version of the opt-out consent text. Second, we conducted a two-factor experiment to test the effects of content manipulations and mode of administration. The results indicate that the way in which respondents were informed had little effect on opt-out rates. Results from a follow-up survey on attitudes regarding privacy, confidentiality, and trust, along with knowledge questions about the linkage process, showed no evidence that presenting the opt-out consent statement makes respondents more concerned about privacy. Knowledge about the record linkage process was generally low. Examining the long-term effects of sending an opt-out consent statement, we found no evidence that it leads to higher attrition or lower participation rates.
Since 1969, families participating in the U.S. Panel Study of Income Dynamics (PSID) have been sent a mailing asking them to update or verify their contact information in order to keep track of their whereabouts between waves. Having updated contact information prior to data collection is associated with fewer call attempts, less tracking, and lower attrition. Based on these advantages, two experiments were designed to increase response rates to the between-wave contact mailing. The first experiment implemented a new protocol that increased the overall response rate by 7-10 percentage points compared to the protocol in place for decades on the PSID. This article provides results from the second experiment, which examines the basic utility of the between-wave mailing, investigates how incentives affect cooperation with the update request and subsequent field effort, and attempts to identify an optimal incentive amount. Recommendations for the use of contact update strategies in panel studies are made.
A source of survey processing error that has received insufficient study to date is the misclassification of open-ended responses. We report on efforts to understand the misclassification of open-ended occupation descriptions in the Current Population Survey (CPS). We analyzed double-coded CPS descriptions to identify which features vary with intercoder reliability. One factor strongly related to reliability was the length of the occupation description: longer descriptions were coded less reliably than shorter ones. This effect was stronger for particular occupation terms. We then carried out an experiment to examine the joint effects of description length and the classification “difficulty” of particular occupation terms. For easy occupation terms, longer descriptions were coded less reliably; for difficult occupation terms, longer descriptions were coded slightly more reliably than short ones. Finally, we observed coders as they provided verbal reports on their decision making. One practice evident in coders’ verbal reports is their use of informal coding rules based on superficial features of the description. Such rules are likely to promote the reliability, though not necessarily the validity, of coding. To the extent that coders use informal rules for long descriptions involving difficult terms, this could help explain the observed relationship between description length and the difficulty of coding particular terms.