In this paper we use the automated procedure “statcheck” to extract over 250,000 p-values from 30,000 psychology articles and check whether they are consistent.
We find that half of the articles contain at least one inconsistency, and 1 in 8 articles contains a gross inconsistency that affects the statistical conclusion. The prevalence of inconsistencies seems to be stable over time.
The article is Open Access and available here: http://link.springer.com/article/10.3758/s13428-015-0664-2
I have been nominated for the VIVA400 for my research on replication. The VIVA400 is a ranking of 400 Dutch women who have achieved something remarkable this year in the categories creativity, science, sports, or environment.
You could vote for me (or one of the other amazing candidates!) here: http://www.viva400.nl/knappe-koppen/michele-nuijten/
Watch a five-minute interview (in Dutch) in which I explain when replication research is and isn’t useful.
This interview was originally published on the website of Tilburg University: https://www.tilburguniversity.edu/nl/over/schools/socialsciences/nieuws/michele-nuijten-over-de-replication-paradox/
In this manuscript we use the R package statcheck (by Sacha Epskamp & me) to examine the prevalence of statistical reporting errors in 8 major psychology journals from 1985 to 2013. We find that half of all articles contain at least one inconsistency, and 1 in 8 articles contains a grossly inconsistent p-value that could have changed the statistical conclusion. We find no evidence that the prevalence of inconsistencies is increasing over the years.
You can find the post-print here: PDF
My research and opinions about replications were featured in a three page article in the Dutch national news. Read the full article by Maarten Keulemans here (in Dutch).
Photographs by Adrie Mouthaan.
My latest paper “The replication paradox: Combining studies can decrease accuracy of effect size estimates” is now published in Review of General Psychology. You can find a postprint of the paper here. The full reference is:
Nuijten, M. B., Van Assen, M. A. L. M., Veldkamp, C. L. S., & Wicherts, J. M. (2015). The replication paradox: Combining studies can decrease accuracy of effect size estimates. Review of General Psychology, 19(2), 172-182. http://dx.doi.org/10.1037/gpr0000034
In the latest edition of “De Psychonoom”, the magazine of the Dutch Association for Psychonomy, I talk about the ‘replication paradox’ and explain why replications do not necessarily decrease bias in effect size estimates. Read the interview here (in Dutch).
Statcheck is an R package that Sacha Epskamp and I wrote together. Statcheck extracts reported statistics from articles and recomputes their p-values. It is extremely handy for checking your own papers for accidental slip-ups in the results section, but it can also be used to estimate error prevalence across a wide range of scientific articles. So far, statcheck can only read results that are reported exactly in APA style. Note that in order to scan PDF files, you need to have the program pdftotext installed and on your PATH. Furthermore, if you want to run statcheck on a Mac, you need to install XQuartz. For more info, see the project page.
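As a minimal sketch of how this works (assuming statcheck is installed from CRAN; the example result string is made up for illustration), you can feed statcheck a piece of text containing APA-formatted results and it will recompute the p-value from the reported test statistic and degrees of freedom:

```r
# install.packages("statcheck")  # once, from CRAN
library(statcheck)

# A hypothetical APA-style result with a reporting slip-up:
txt <- "The effect was significant, t(28) = 2.20, p < .01."

# statcheck parses the t-value and df, recomputes p (two-tailed by default),
# and compares it with the reported p-value.
res <- statcheck(txt)
res
# t(28) = 2.20 actually corresponds to p of about .036, so the reported
# "p < .01" is flagged as inconsistent (though not grossly inconsistent,
# since the conclusion at alpha = .05 does not change).

# To scan whole PDF files instead of text (requires pdftotext on your PATH):
# checkPDF("mypaper.pdf")
```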
Will integrating original studies and published replications always improve the reliability of your results? No! Replication studies suffer from the same publication bias as original studies… Read the full blog here.