New Preprint: Using Statcheck in Peer Review May Reduce Errors

We investigated whether statistical reporting inconsistencies could be avoided if journals implement the tool statcheck in the peer review process.

In a preregistered study covering over 7,000 articles, we compared inconsistency rates between two journals that implemented statcheck in their peer review process (Psychological Science and Journal of Experimental Social Psychology) and two matched control journals (Journal of Experimental Psychology: General and Journal of Personality and Social Psychology, respectively), before and after statcheck was implemented.

Preregistered multilevel logistic regression analyses showed that the decrease in both general inconsistencies and decision inconsistencies around p = .05 was considerably steeper in the statcheck journals than in the control journals. This supports the notion that statcheck can be a useful tool for journals to avoid statistical reporting inconsistencies in published articles.
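To make the two outcome measures concrete: an inconsistency is a reported p-value that does not match the p-value recomputed from the reported test statistic, and a decision inconsistency is one where the reported and recomputed p-values land on opposite sides of α = .05. The sketch below illustrates the idea for z-tests only (a simplification; statcheck itself is an R package covering t, F, r, χ², and z tests, with more careful rounding rules):

```python
import math

def p_from_z(z):
    """Two-tailed p-value for a z statistic, using only the stdlib."""
    return math.erfc(abs(z) / math.sqrt(2))

def classify(z, reported_p, alpha=0.05):
    """Compare a reported p-value against the one recomputed from z.

    Returns "consistent", "inconsistency", or "decision inconsistency"
    (the last when reported and recomputed p fall on opposite sides of alpha).
    Uses a naive rounding rule purely for illustration.
    """
    computed = p_from_z(z)
    if round(computed, 3) == round(reported_p, 3):
        return "consistent"
    if (computed < alpha) != (reported_p < alpha):
        return "decision inconsistency"
    return "inconsistency"
```

For example, `classify(2.20, 0.028)` is consistent (recomputed p ≈ .028), `classify(2.00, 0.04)` is an ordinary inconsistency (recomputed p ≈ .046, still below .05), and `classify(1.85, 0.03)` is a decision inconsistency (recomputed p ≈ .064, above .05).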

You can find the full preprint here: https://psyarxiv.com/bxau9

Young eScientist Award

In December 2020, Willem Sleegers and I were awarded the Young eScientist Award from the Netherlands eScience Center for our proposal to improve statcheck’s searching algorithm. Today marks the start of our collaboration with the eScience Center and we are very excited to get started!

In this project, we plan to extend statcheck’s search algorithm with natural language processing, so that it can recognize more statistics than just those reported in perfect APA style (a current restriction). We hope that this extension will expand statcheck’s functionality beyond psychology, so that statistical errors in, e.g., biomedical and economics papers can also be detected and corrected.
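To illustrate the current restriction: statcheck’s extraction is regex-based, so only results written in strict APA style are picked up. The pattern below is a simplified stand-in (not statcheck’s actual regular expression) that shows why a slight rephrasing slips through:

```python
import re

# Simplified stand-in for the kind of pattern statcheck relies on:
# it only matches results in strict APA style, e.g. "t(28) = 2.20, p = .036".
APA_RESULT = re.compile(
    r"\b(?P<test>t|F|r)"                 # test statistic label
    r"\((?P<df>\d+(?:,\s*\d+)?)\)"       # degrees of freedom, e.g. (28) or (2, 45)
    r"\s*=\s*(?P<value>-?\d*\.?\d+)"     # reported test value
    r"\s*,\s*p\s*(?P<comp>[<=>])\s*"     # comparator before the p-value
    r"(?P<p>\d?\.\d+)"                   # reported p-value
)

APA_RESULT.search("We found t(28) = 2.20, p = .036.")      # matches
APA_RESULT.search("the t statistic was 2.20 (p = .036)")   # no match: not APA style
```

The second sentence reports exactly the same result, but a fixed pattern cannot see it; recognizing such variants is what the NLP extension is meant to handle.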

More information about the award can be found here.


METAxDATA Meeting at QUEST, Berlin

Last month, the QUEST center in Berlin organized the first METAxDATA meeting on building automated screening tools for data-driven meta-research. On the first night of the meeting, 13 researchers gave lightning talks about their tools. The clip below features my lightning talk (under two minutes) about statcheck.

All lightning talks were recorded and can be found here.

New Preprint: statcheck’s Validity is High

In our new preprint we investigated the validity of statcheck. Our main conclusions were:

  • statcheck’s sensitivity, specificity, and overall accuracy are very high. The specific numbers depended on several choices and assumptions, but ranged from:
    • sensitivity: 85.3% – 100%
    • specificity: 96.0% – 100%
    • accuracy: 96.2% – 99.9%
  • The prevalence of statistical corrections (e.g., Bonferroni, or Greenhouse-Geisser) seems to be higher than we initially estimated
  • But: the presence of these corrections doesn’t explain the high prevalence of reporting inconsistencies in psychology
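For readers less familiar with these validity metrics: they follow from a confusion matrix that cross-tabulates statcheck’s flags against a manual gold standard. A minimal sketch, with made-up counts purely for illustration (not the preprint’s data):

```python
def sensitivity(tp, fn):
    """Proportion of genuinely inconsistent results that statcheck flags."""
    return tp / (tp + fn)

def specificity(tn, fp):
    """Proportion of correctly reported results that statcheck leaves unflagged."""
    return tn / (tn + fp)

def accuracy(tp, tn, fp, fn):
    """Proportion of all results that statcheck classifies correctly."""
    return (tp + tn) / (tp + tn + fp + fn)

# Hypothetical counts against a manual gold standard:
tp, fn, tn, fp = 88, 12, 960, 40
print(sensitivity(tp, fn))        # flagged inconsistencies / all true inconsistencies
print(specificity(tn, fp))        # unflagged correct results / all correct results
print(accuracy(tp, tn, fp, fn))   # all correct classifications / all results
```

Note that when true inconsistencies are rare, accuracy is dominated by specificity, which is why the preprint reports all three ranges rather than accuracy alone.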

We conclude that statcheck’s validity is high enough to recommend it as a tool in peer review, self-checks, or meta-research.
