Young eScientist Award

In December 2020, Willem Sleegers and I were awarded the Young eScientist Award from the Netherlands eScience Center for our proposal to improve statcheck’s searching algorithm. Today marks the start of our collaboration with the eScience Center and we are very excited to get started!

In this project, we plan to extend statcheck’s search algorithm with natural language processing techniques, so that it can recognize more statistics than just those reported in perfect APA style (a current restriction). We hope that this extension will expand statcheck’s usefulness beyond psychology, so that statistical errors in, e.g., biomedical and economics papers can also be detected and corrected.
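To illustrate the current restriction: statcheck relies on rigid, regular-expression-style pattern matching, so a result only gets picked up when it is reported in textbook APA format. The pattern below is a hypothetical, heavily simplified Python sketch of that idea, not statcheck’s actual (R) implementation:

```python
import re

# Hypothetical simplification of APA-style matching: test letter, degrees
# of freedom, test value, and p-value must all appear in a rigid format.
APA_T_TEST = re.compile(
    r"t\((?P<df>\d+(?:\.\d+)?)\)\s*=\s*(?P<value>-?\d+\.\d+),\s*"
    r"p\s*(?P<comp>[<>=])\s*(?P<p>\.\d+)"
)

# A perfectly APA-formatted result is recognized:
match = APA_T_TEST.search("The effect was significant, t(28) = 2.20, p = .036.")
print(match.groupdict())  # {'df': '28', 'value': '2.20', 'comp': '=', 'p': '.036'}

# A slight deviation, common outside psychology, already breaks the match:
print(APA_T_TEST.search("t-value = 2.20 (df = 28), p = .036"))  # None
```

This brittleness is exactly what the NLP extension is meant to address: recognizing the second kind of phrasing as the same underlying statistic.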

More information about the award can be found here.


METAxDATA Meeting at QUEST, Berlin

Last month, the QUEST center in Berlin organized the first METAxDATA meeting on building automated screening tools for data-driven meta-research. On the first night of the meeting, 13 researchers gave lightning talks about their tools. The clip below features my <2 minute lightning talk about statcheck.

All lightning talks were recorded and can be found here.

New Preprint: statcheck’s Validity is High

In our new preprint we investigated the validity of statcheck. Our main conclusions were:

  • statcheck’s sensitivity, specificity, and overall accuracy are very high. The specific numbers depended on several choices & assumptions, but ranged from:
    • sensitivity: 85.3% – 100%
    • specificity: 96.0% – 100%
    • accuracy: 96.2% – 99.9%
  • The prevalence of statistical corrections (e.g., Bonferroni, or Greenhouse-Geisser) seems to be higher than we initially estimated
  • But: the presence of these corrections doesn’t explain the high prevalence of reporting inconsistencies in psychology
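For reference, the three metrics above follow directly from a confusion matrix comparing statcheck’s classifications against a manual gold standard. A minimal sketch with hypothetical counts (not the counts from the preprint):

```python
# Hypothetical confusion-matrix counts (NOT the preprint's data):
# tp = true inconsistencies flagged, tn = consistent results passed,
# fn = inconsistencies missed,       fp = false alarms.
tp, fp, tn, fn = 85, 4, 96, 15

sensitivity = tp / (tp + fn)                # share of real inconsistencies caught
specificity = tn / (tn + fp)                # share of consistent results passed
accuracy = (tp + tn) / (tp + fp + tn + fn)  # share of all classifications correct

print(f"sensitivity = {sensitivity:.1%}")  # 85.0%
print(f"specificity = {specificity:.1%}")  # 96.0%
print(f"accuracy    = {accuracy:.1%}")     # 90.5%
```

The ranges reported above arise because different choices and assumptions (e.g., how to count results statcheck cannot read at all) shift these counts.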

We conclude that statcheck’s validity is high enough to recommend it as a tool in peer review, self-checks, or meta-research.


statcheck Runner-Up for Sentinel Award

Publons announced the winner of the Sentinel Award for outstanding advocacy, innovation or contribution to scholarly peer review, and I am proud to announce that statcheck was crowned runner-up!

I am honored that the judges considered statcheck a useful contribution to the peer review system. In the end, one of the things I hope to achieve is that all psychology journals will consider it standard practice to quickly “statcheck” a paper for statistical inconsistencies before publication.

A very warm congratulations to the winner of the award: Irene Hames. Irene spent most of her career improving the quality of peer review, and it is great that her work is recognized in this way! Congratulations also to the rest of the Sentinel Award nominees: Retraction Watch, the American Geophysical Union, ORCiD, F1000Research, The Committee on Publication Ethics (COPE), and Kyle Martin and Gareth Fraser.

For more information about the award, the winner, and the finalists, see this page.

The Guardian’s Science Weekly Podcast feat. statcheck

This week the Guardian’s Science Weekly podcast focuses on statistical malpractice and fraud in science. We talk about the role of statcheck in detecting statistical inconsistencies, and discuss the causes and implications of seemingly innocent rounding errors.
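At its core, the inconsistency check we discuss in the episode is simple: recompute the p-value from the reported test statistic and degrees of freedom, and compare it to the reported p-value, allowing for rounding. A minimal sketch using a z-statistic (chosen because Python’s standard library includes the normal distribution; statcheck itself is an R package and handles t, F, χ², and other tests):

```python
from statistics import NormalDist

def consistent(z: float, reported_p: float, decimals: int = 3) -> bool:
    """Check whether a reported two-tailed p-value matches the value
    recomputed from the reported z-statistic, up to rounding."""
    recomputed = 2 * (1 - NormalDist().cdf(abs(z)))
    return round(recomputed, decimals) == reported_p

print(consistent(z=2.20, reported_p=0.028))  # True: recomputed p ≈ .0278
print(consistent(z=2.20, reported_p=0.014))  # False: looks like a one-tailed p
```

A mismatch like the second one is a reporting inconsistency, which may stem from an innocent rounding or copy error, but, as the episode discusses, can also hint at something worse.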

This podcast also offers fascinating insights from consultant anaesthetist John Carlisle about the detection of data fabrication, and president of the Royal Statistical Society David Spiegelhalter about the dangers of statistical malpractice.

statcheck Shortlisted for Publons Sentinel Award!

Proud to announce that I’ve been shortlisted for the Publons Sentinel Award for my work on statcheck. The Sentinel Award is an award for outstanding advocacy, innovation or contribution to scholarly peer review.

At this point, statcheck is used in the peer review process of two major psychology journals (Psychological Science and the Journal of Experimental Social Psychology), and an increasing number of journals recommend that authors run statcheck on their own manuscript before submitting it.

For more information about the award and the other great candidates, see this page.