LSE Impact Blog on Data Sharing

I had the opportunity to write an LSE Impact Blog about findings from our most recent preprint. I argue that data sharing is vital for scientific progress, and that incentivizing data sharing might be a lot easier than it sounds.

Check out the full blog here.


statcheck Shortlisted for Publons Sentinel Award!

Proud to announce that I’ve been shortlisted for the Publons Sentinel Award for my work on statcheck. The Sentinel Award recognizes outstanding advocacy, innovation, or contribution to scholarly peer review.

At this point, statcheck is used in the peer review process of two major psychology journals (Psychological Science and the Journal of Experimental Social Psychology), and a growing number of journals recommend running statcheck on your manuscript before submission.

For more information about the award and the other great candidates, see this page.


New Preprint: Data Sharing & Statistical Inconsistencies

We just published the preprint of our new study “Journal Data Sharing Policies and Statistical Reporting Inconsistencies in Psychology” at https://osf.io/preprints/psyarxiv/sgbta.

In this paper, we ran three independent studies to investigate whether data sharing is related to fewer statistical inconsistencies in a paper. Overall, we found no relationship between data sharing and reporting inconsistencies. However, we did find that journal policies on data sharing are extremely effective in promoting data sharing (see the Figure below).

[Figure: effectiveness of the open data policy in promoting data sharing]

We argue that open data is essential in improving the quality of psychological science, and we discuss ways to detect and reduce reporting inconsistencies in the literature.

Awarded a Campbell Methods Grant

I am honored to announce that Joshua R. Polanin and I were awarded a $20,000 methods grant from the Campbell Collaboration for the project “Verifying the Accuracy of Statistical Significance Testing in Campbell Collaboration Systematic Reviews Through the Use of the R Package statcheck”.

The grant is part of the Campbell Collaboration’s program to support innovative methods development to improve the quality of systematic reviews. It is great that we (and statcheck!) can be a part of this effort.

For more information about the grant and the three other recipients, see their website here.


A Logo for statcheck!

I’m very happy to present statcheck’s very own logo! With over 6000 downloads, over 6000 visits to the web app, and a running pilot in Psychological Science, I thought statcheck deserved one. Also, this might settle the upper case/lower case confusion once and for all: it’s not Statcheck, StatCheck, statCheck, or even STATCHECK, but:

[Image: the statcheck logo]

Leamer-Rosenthal Prize for statcheck

I’m honored and proud to announce that Sacha Epskamp and I won the 2016 Leamer-Rosenthal Prize for Open Social Science! This award is an initiative of the Berkeley Initiative for Transparency in the Social Sciences (BITSS), and comes with a prize of $10,000.


We are invited to attend the 2016 BITSS annual meeting to receive our prize, along with seven other researchers and educators.

For more information about our nomination and the prize itself see: http://www.bitss.org/2016/12/02/sneak-peek-announcing-two-of-nine-winners-of-the-2016-leamer-rosenthal-prizes-for-open-social-science-michele-nuijten-and-sacha-epskamp/


statcheck 1.2.2 now on CRAN & statcheck manual on RPubs

The new statcheck 1.2.2* is now on CRAN!

Main updates:

  • Improved the regular expressions so that statcheck no longer misreads unusual statistics with subscripts as chi-square tests
  • You can now choose whether to count “p = .000” as incorrect (in the previous version it was always counted as incorrect)
  • The statcheck plot function now renders a plot in APA style (thanks to John Sakaluk for writing this code!)
  • “checkPDF()” and “checkHTML()” now open a pop-up window to choose a file when no file is specified
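For readers new to the package, a minimal usage sketch in R (my own illustration, not from the package documentation; the exact output columns depend on the statcheck version):

```r
# Install statcheck from CRAN (once) and load it
install.packages("statcheck")
library(statcheck)

# Check an APA-style reported result: statcheck extracts the test
# statistic and degrees of freedom, recomputes the p-value, and
# compares it to the reported one
res <- statcheck("t(28) = 2.20, p = .03")
res

# Extract and check all APA-reported statistics in a PDF; as of
# version 1.2.2, calling it without a file opens a pop-up chooser
checkPDF()
```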

For the full list of adaptations, see the History page on GitHub.

Besides the new updated package, I also created a detailed manual with instructions for installation and use of statcheck, including many examples and explanation of the output. You can find the manual on RPubs here.

* For the people who actually know what this numbering stands for: you may have noticed that the previous version on CRAN was version 1.0.2, so this seems like a weird step. It is. It’s because at first I had no idea what these numbers stood for (MAJOR.MINOR.PATCH), so I was just adding numbers at random. Actually the previous version should have been 1.1.x, which means that I’m now at 1.2.x. The last two PATCHES were because I messed up the R CMD check and had to fix some last-minute things 🙂