References of "Assche, Jasper Van"
Observing many researchers using the same data and hypothesis reveals a hidden universe of uncertainty
Breznau, Nate; Rinke, Eike Mark; Wuttke, Alexander et al

in Proceedings of the National Academy of Sciences (2022), 119(44), e2203150119

This study explores how researchers’ analytical choices affect the reliability of scientific findings. Most discussions of reliability problems in science focus on systematic biases. We broaden the lens to emphasize the idiosyncrasy of conscious and unconscious decisions that researchers make during data analysis. We coordinated 161 researchers in 73 research teams and observed their research decisions as they used the same data to independently test the same prominent social science hypothesis: that greater immigration reduces support for social policies among the public. In this typical case of social science research, research teams reported both widely diverging numerical findings and substantive conclusions despite identical start conditions. Researchers’ expertise, prior beliefs, and expectations barely predict the wide variation in research outcomes. More than 95% of the total variance in numerical results remains unexplained even after qualitative coding of all identifiable decisions in each team’s workflow. This reveals a universe of uncertainty that remains hidden when considering a single study in isolation. The idiosyncratic nature of how researchers’ results and conclusions varied is a previously underappreciated explanation for why many scientific hypotheses remain contested. These results call for greater epistemic humility and clarity in reporting scientific findings.

How Many Replicators Does It Take to Achieve Reliability? Investigating Researcher Variability in a Crowdsourced Replication
Breznau, Nate; Rinke, Eike Mark; Wuttke, Alexander et al

E-print/Working paper (2021)

The paper reports findings from a crowdsourced replication. Eighty-four replicator teams attempted to verify results reported in an original study by running the same models with the same data. The replication involved an experimental condition. A “transparent” group received the original study and code, and an “opaque” group received the same underlying study but with only a methods section and description of the regression coefficients without size or significance, and no code. The transparent group mostly verified the original study (95.5%), while the opaque group had less success (89.4%). Qualitative investigation of the replicators’ workflows reveals many causes of non-verification. Two categories of these causes are hypothesized: routine and non-routine. After correcting non-routine errors in the research process to ensure that the results reflect a level of quality that should be present in ‘real-world’ research, the rate of verification was 96.1% in the transparent group and 92.4% in the opaque group. Two conclusions follow: (1) although high, the verification rate suggests that it would take a minimum of three replicators per study to achieve replication reliability of at least 95% confidence, assuming ecological validity in this controlled setting, and (2) like any type of scientific research, replication is prone to errors that derive from routine and undeliberate actions in the research process. The latter suggests that idiosyncratic researcher variability might provide a key to understanding part of the “reliability crisis” in social and behavioral science and is a reminder of the importance of transparent and well documented workflows.

Observing Many Researchers Using the Same Data and Hypothesis Reveals a Hidden Universe of Uncertainty
Breznau, Nate; Rinke, Eike Mark; Wuttke, Alexander et al

E-print/Working paper (2021)

How does noise generated by researcher decisions undermine the credibility of science? We test this by observing all decisions made among 73 research teams as they independently conduct studies on the same hypothesis with identical starting data. We find excessive variation of outcomes. When combined, the 107 observed research decisions taken across teams explained at most 2.6% of the total variance in effect sizes and 10% of the deviance in subjective conclusions. Expertise, prior beliefs and attitudes of the researchers explain even less. Each model deployed to test the hypothesis was unique, which highlights a vast universe of research design variability that is normally hidden from view and suggests humility when presenting and interpreting scientific findings.
