statcheck Runner-Up for Sentinel Award

Publons announced the winner of the Sentinel Award for outstanding advocacy, innovation, or contribution to scholarly peer review, and I am proud to say that statcheck was crowned runner-up!

I am honored that the judges considered statcheck a useful contribution to the peer review system. One of the things I hope to achieve is that all psychology journals will consider it standard practice to quickly “statcheck” a paper for statistical inconsistencies before publishing it.

A very warm congratulations to the winner of the award: Irene Hames. Irene has spent most of her career improving the quality of peer review, and it is great that her work is recognized in this way! Congratulations also to the rest of the Sentinel Award nominees: Retraction Watch, the American Geophysical Union, ORCiD, F1000Research, the Committee on Publication Ethics (COPE), and Kyle Martin and Gareth Fraser.

For more information about the award, the winner, and the finalists, see this page.

The Guardian’s Science Weekly Podcast feat. statcheck

This week the Guardian’s Science Weekly podcast focuses on statistical malpractice and fraud in science. We talk about the role of statcheck in detecting statistical inconsistencies, and discuss the causes and implications of seemingly innocent rounding errors.
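At its core, the check statcheck performs is simple: parse a reported result such as “t(28) = 2.20, p = .036”, recompute the p-value from the test statistic and degrees of freedom, and flag a mismatch. statcheck itself is an R package; the sketch below is only a simplified Python illustration of that idea, with made-up parsing and rounding rules rather than statcheck’s actual implementation.

```python
import math
import re

def two_sided_p(t, df, steps=100_000, upper=60.0):
    """Two-sided p-value for a t statistic, obtained by numerically
    integrating the Student-t density from |t| outward (trapezoid rule)."""
    c = math.gamma((df + 1) / 2) / (math.sqrt(df * math.pi) * math.gamma(df / 2))
    dens = lambda x: c * (1 + x * x / df) ** (-(df + 1) / 2)
    lo = abs(t)
    h = (upper - lo) / steps
    area = 0.5 * (dens(lo) + dens(upper))
    for i in range(1, steps):
        area += dens(lo + i * h)
    return 2 * area * h  # both tails

def check(report):
    """Parse a result like 't(28) = 2.20, p = .036', recompute p, and
    say whether the reported (rounded) p is consistent with it."""
    m = re.match(r"t\((\d+)\)\s*=\s*([\d.]+),\s*p\s*=\s*(\.\d+)", report)
    df, t, p_str = int(m.group(1)), float(m.group(2)), m.group(3)
    p_computed = two_sided_p(t, df)
    # Crude rounding rule: consistent if the recomputed p is within half a
    # unit in the last decimal place of the reported value.
    decimals = len(p_str) - 1  # digits after the dot
    consistent = abs(p_computed - float(p_str)) < 0.5 * 10 ** (-decimals)
    return round(p_computed, 4), consistent
```

For example, `check("t(28) = 2.20, p = .02")` would be flagged as inconsistent, because the recomputed two-sided p is roughly .036 — exactly the kind of seemingly innocent rounding error discussed in the podcast.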

The podcast also offers fascinating insights from consultant anaesthetist John Carlisle about the detection of data fabrication, and from Royal Statistical Society president David Spiegelhalter about the dangers of statistical malpractice.

statcheck Shortlisted for Publon Sentinel Award!

Proud to announce that I’ve been shortlisted for the Publons Sentinel Award for my work on statcheck. The Sentinel Award recognizes outstanding advocacy, innovation, or contribution to scholarly peer review.

At this point, statcheck is used in the peer review process of two major psychology journals (Psychological Science and the Journal of Experimental Social Psychology), and an increasing number of journals recommend that authors run statcheck on their own manuscripts before submitting.

For more information about the award and the other great candidates, see this page.


New Preprint: Data Sharing & Statistical Inconsistencies

We just published the preprint of our new study “Journal Data Sharing Policies and Statistical Reporting Inconsistencies in Psychology” at https://osf.io/preprints/psyarxiv/sgbta.

In this paper, we report three independent studies investigating whether data sharing is related to fewer statistical inconsistencies in a paper. Overall, we found no relationship between data sharing and reporting inconsistencies. However, we did find that journal policies on data sharing are extremely effective in promoting data sharing (see the Figure below).

[Figure: effectiveness of journal open data policies in promoting data sharing]

We argue that open data is essential in improving the quality of psychological science, and we discuss ways to detect and reduce reporting inconsistencies in the literature.

Awarded a Campbell Methods Grant

I am honored to announce that Joshua R. Polanin and I were awarded a $20,000 methods grant from the Campbell Collaboration for the project “Verifying the Accuracy of Statistical Significance Testing in Campbell Collaboration Systematic Reviews Through the Use of the R Package statcheck”.

The grant is part of the Campbell Collaboration’s program to support innovative methods development and improve the quality of systematic reviews. It is great that we (and statcheck!) can be a part of this effort.

For more information about the grant and the three other recipients, see their website here.


A Logo for statcheck!

I’m very happy to present statcheck’s very own logo! With over 6000 downloads, over 6000 visits to the web app, and a running pilot in Psychological Science, I thought statcheck deserved one. Also, this might settle the upper case/lower case confusion once and for all: it’s not Statcheck, StatCheck, statCheck, or even STATCHECK, but:

[Image: the statcheck logo]

Leamer-Rosenthal Prize for statcheck

I’m honored and proud to announce that Sacha Epskamp and I won the 2016 Leamer-Rosenthal Prize for Open Social Science! This award is an initiative of the Berkeley Initiative for Transparency in the Social Sciences (BITSS), and comes with a prize of $10,000.


We are invited to attend the 2016 BITSS annual meeting to receive our prize, along with seven other researchers and educators.

For more information about our nomination and the prize itself see: http://www.bitss.org/2016/12/02/sneak-peek-announcing-two-of-nine-winners-of-the-2016-leamer-rosenthal-prizes-for-open-social-science-michele-nuijten-and-sacha-epskamp/