Robert Trivers in Psychology Today:
I point out in The Folly of Fools that science is naturally self-correcting: it requires experiments, data gathering, and modes of analysis to be fully explicit, the better to be replicated and thus verified or falsified. But where humans or social behavior are involved, the temptation toward quick and illegitimate progress is heightened by the apparent importance of the results and the difficulty of checking their veracity. Recently, cases of deliberate fraud have been uncovered in the study of primate cognition (Harvard), the health benefits of resveratrol (UConn), and numerous social psychology findings (Tilburg University, Netherlands). I will devote later blogs to other aspects of fraud in science, but I begin here with a very clever analysis of statistical fraud and lack of data sharing in psychology papers published in the United States. This and related work suggest that the problem of fraud in science is much broader than the few cases of deliberate, large-scale fraud might suggest.
Wicherts and co-authors made use of a little-noted feature of all papers published in the more than 50 journals of the American Psychological Association (APA): the authors of these papers commit by contract to sharing their raw data with anyone who asks for it, in order to permit attempts at replication. Yet earlier work by this same group showed that for 141 papers in four top APA journals, 73 percent of the scientists did not share data when asked. Since, as they point out, statistical errors are known to be surprisingly common, accounts of statistical results are sometimes inaccurate, and scientists are often motivated to make decisions during statistical analysis that are biased in their own preferred direction, they were naturally curious to see whether there was any connection between failure to share data and evidence of statistical bias.
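The kind of check behind such analyses can be done without the raw data at all: a reported test statistic implies a p-value, so the two can be compared for internal consistency. As a minimal sketch (not the authors' actual method; the example figures and the tolerance are hypothetical), here is how one might flag a reported z statistic whose stated p-value does not match the recomputed one:

```python
import math

def two_tailed_p_from_z(z: float) -> float:
    """Recompute the two-tailed p-value implied by a standard-normal statistic."""
    return math.erfc(abs(z) / math.sqrt(2))

def is_consistent(reported_z: float, reported_p: float, tol: float = 0.005) -> bool:
    """True if the reported p-value matches the recomputed one within tol."""
    return abs(two_tailed_p_from_z(reported_z) - reported_p) <= tol

# Hypothetical reported results: (z statistic, reported p-value)
results = [(1.96, 0.05), (2.10, 0.01)]
flags = [is_consistent(z, p) for z, p in results]
# The first pair is internally consistent; the second reports a p-value
# noticeably smaller than the one its own test statistic implies.
```

A discrepancy flagged this way is not proof of fraud, of course; it may be a typo or rounding. The point is that such inconsistencies can be counted across many papers and compared against willingness to share data.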
Here is where they got a dramatic result. They limited their analysis to two of the four journals, whose authors were slightly more likely to share data and most of whose studies had a similar experimental design. This gave them 49 papers. Again, the majority failed to share any data, a showing that reads as a parody of academic openness.