Friday, October 9, 2015

9/10/15: Is Economics Research Replicable?… err… “Usually Not”


An interesting paper, albeit one limited by its sample size, on the replicability of research findings in Economics (link here).

The authors took 67 papers published in 13 “well-regarded economics journals” and attempted to replicate the papers’ reported findings. The researchers asked the papers’ authors and the journals for the original data and code used in preparing each paper (in some top Economics journals, it is normal practice to require disclosure of the data and the estimation code for the empirical models as a condition of publication).

“Aside from 6 papers that use confidential data, we obtain data and code replication files for 29 of 35 papers (83%) that are required to provide such files as a condition of publication, compared to 11 of 26 papers (42%) that are not required to provide data and code replication files.”

Here is the top-line conclusion from the study: “We successfully replicate the key qualitative result of 22 of 67 papers (33%) without contacting the authors. Excluding the 6 papers that use confidential data and the 2 papers that use software we do not possess, we replicate 29 of 59 papers (49%) with assistance from the authors.”
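(A quick back-of-the-envelope check, not from the paper itself: the minimal Python sketch below simply verifies that the counts quoted above imply the quoted percentages, using the paper’s whole-percent rounding.)

```python
# Sanity-check the headline replication rates quoted from the paper.
# All counts come from the quotes above; nothing here is new data.

def rate(successes, total):
    """Format 'successes of total (pct%)' the way the quotes do."""
    return f"{successes} of {total} ({successes / total:.0%})"

# Without contacting the authors: 22 of all 67 papers.
print("No author contact:   ", rate(22, 67))          # 22 of 67 (33%)

# With author assistance, after excluding 6 confidential-data papers
# and 2 papers needing software the replicators lacked: 67 - 6 - 2 = 59.
print("With author help:    ", rate(29, 67 - 6 - 2))  # 29 of 59 (49%)

# File availability, by whether the journal mandates disclosure.
print("Files when required: ", rate(29, 35))          # 29 of 35 (83%)
print("Files when optional: ", rate(11, 26))          # 11 of 26 (42%)
```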

In other words, even with assistance from the original papers’ authors, more than half of the published results could not be reproduced.

“Because we are able to replicate less than half of the papers in our sample even with help from the authors, we assert that economics research is usually not replicable.”

This is hardly new, as noted by the study authors. “Despite our finding that economics research is usually not replicable, our replication success rates are still notably higher than those reported by existing studies of replication in economics. McCullough, McGeary, and Harrison (2006) find a replication success rate for articles published in the JMCB of 14 of 186 papers (8%), conditioned on the replicators’ access to appropriate software, the original article’s use of non-proprietary data, and without assistance from the original article’s authors. Adding a requirement that the JMCB archive contain data and code replication files [for] the paper increases their success rate to 14 of 62 papers (23%). Our comparable success rates are 22 of 59 papers (37%), conditioned on our having appropriate software and non-proprietary data, and 22 of 38 papers (58%) when we impose the additional requirement of having data and code files. Dewald, Thursby, and Anderson (1986) successfully replicate 7 of 54 papers (13%) from the JMCB, conditioned on the replicators having data and code files, the original article’s use of non-confidential data, help from the original article’s authors, and appropriate software. Our comparable figure is 29 of 38 papers (76%).”
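(Again, this is not code from the study; it is just a small sketch that lays the quoted success rates side by side, with the study labels and conditions paraphrased from the block quote above.)

```python
# Side-by-side view of the replication success rates quoted above.
# Every figure comes straight from the block quote; only the
# percentage is recomputed, to confirm it matches.

comparisons = [
    # (study, condition, successes, total)
    ("McCullough et al. (2006), JMCB", "software + non-proprietary data, no author help", 14, 186),
    ("McCullough et al. (2006), JMCB", "plus data/code files in archive",                 14,  62),
    ("This study",                     "software + non-proprietary data, no author help", 22,  59),
    ("This study",                     "plus data/code files",                            22,  38),
    ("Dewald et al. (1986), JMCB",     "files + non-confidential data + author help",      7,  54),
    ("This study",                     "comparable conditions",                           29,  38),
]

for study, condition, k, n in comparisons:
    print(f"{study:32} | {condition:48} | {k:>3}/{n:<3} = {k / n:.0%}")
```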

A handy summary of results: [summary table from the paper]
So in basic terms, economists are not only pretty darn useless at achieving forecasting accuracy (which we know, and don’t really care about, for reasons too hefty to explain here), but we are also pretty darn useless at reproducing the results of our own empirical studies, even when using the same data. Hmmm…
