References of "Social Science Computer Review"
Full Text
Peer Reviewed
Enhancing Participation in Probability-Based Online Panels: Two Incentive Experiments and Their Effects on Response and Panel Recruitment
Witte, Nils; Schaurer, Ines; Schröder, Jette et al

in Social Science Computer Review (2022)


This article investigates how mail-based online panel recruitment can be facilitated through incentives. The analysis relies on two incentive experiments and their effects on panel recruitment, and the intermediate participation in the recruitment survey. The experiments were implemented in the context of the German Emigration and Remigration Panel Study and encompass two samples of randomly sampled persons. Tested incentives include a conditional lottery, conditional monetary incentives, and the combination of unconditional money-in-hand with conditional monetary incentives. For a comprehensive evaluation of the link between incentives and panel recruitment, the article further assesses the incentives’ implications for demographic composition and panel recruitment unit costs. Multivariate analysis indicates that low combined incentives (€5/€5) or, where unconditional disbursement is unfeasible, high conditional incentives (€20) are most effective in enhancing panel participation. In terms of demographic bias, low combined incentives (€5/€5) and €10 conditional incentives are the favored options. The budget options from the perspective of panel recruitment include the lottery and the €10 conditional incentive, which break even at net sample sizes of 1,000.

Full Text
Peer Reviewed
The Impact of Forced Answering and Reactance on Answering Behavior in Online Surveys
Sischka, Philipp; Décieux, Jean Philippe; Mergener, Alexandra et al

in Social Science Computer Review (2020)


Forced answering (FA) is a frequent answer format in online surveys that forces respondents to answer each question in order to proceed through the questionnaire. The underlying rationale is to decrease the amount of missing data. Despite its popularity, empirical research on the impact of FA on respondents’ answering behavior is scarce and has generated mixed findings. In fact, some quasi-experimental studies showed that FA has detrimental consequences such as increased survey dropout rates and faking behavior. Notably, a theoretical psychological process driving these effects has hitherto not been identified. Therefore, the aim of the present study was twofold: First, we sought to experimentally replicate detrimental effects of FA on online questionnaire data quality. Second, we tried to uncover an explanatory psychological mechanism. Specifically, we hypothesized that FA effects are mediated through reactance. Zero-order effects showed that FA increased state reactance and questionnaire dropout as well as reduced answer length in open-ended questions. Results of survival and mediation analyses corroborate negative FA effects on data quality and the proposed psychological process.
