How to make replication the norm

Gertler et al. in Nature:

Replication is essential for building confidence in research studies [1], yet it is still the exception rather than the rule [2,3]. That is not necessarily because funding is unavailable — it is because the current system makes original authors and replicators antagonists. Focusing on the fields of economics, political science, sociology and psychology, in which ready access to raw data and software code is crucial to replication efforts, we survey deficiencies in the current system.

We propose reforms that can both encourage and reinforce better behaviour — a system in which authors feel that replication of software code is both probable and fair, and in which less time and effort is required for replication.

Current incentives for replication attempts reward those efforts that overturn the original results. In fact, in the 11 top-tier economics journals we surveyed, we could find only 11 replication studies — in this case, defined as reanalyses using the same data sets — published since 2011. All claimed to refute the original results. We also surveyed 88 editors and co-editors from these 11 journals. All editors who replied (35 in total, including at least one from each journal) said they would, in principle, publish a replication study that overturned the results of an original study. Only nine of the respondents said that they would consider publishing a replication study that confirmed the original results.

We also personally experienced antagonism between replicators and authors in a programme sponsored by the International Initiative for Impact Evaluation (3ie), a non-governmental organization that actively funds software-code replication. We participated as authors of original studies (P.G. and S.G.) and as the chair of 3ie’s board of directors (P.G.). In our experience, the programme worked like this: 3ie selected influential papers to be replicated and then held an open competition, awarding approximately US$25,000 for the replication of each study [4]. The organization also offered the original authors the opportunity to review and comment on the replications. Of the 27 studies commissioned, 21 were completed, and 7 of those (33%) reported that they were unable to fully replicate the results in the original article. The only replication published in a peer-reviewed journal [5] claimed to refute the results of the original paper.

Despite 3ie’s best efforts, adversarial relationships developed between original and replication researchers. Original authors of five studies wrote in public comments that the replications actively sought to refute their results and were nitpicking.

More here.