Seed Funding for COVID-19 Project

I am happy to announce that Robbie van Aert, Jelte Wicherts, and I received seed funding from the Herbert Simon Research Institute for our project to screen COVID-19 preprints for statistical inconsistencies.

Inconsistencies can distort conclusions, and even small inconsistencies hurt the reproducibility of a paper: it becomes unclear where a reported number came from. Statistical reproducibility is a basic requirement for any scientific paper.

We plan to check a random sample of COVID-19 preprints from medRxiv and bioRxiv for several types of statistical inconsistencies. For example: does a reported percentage match the accompanying fraction? Do the reported numbers of true and false positives and negatives (TP/TN/FP/FN) match the reported sensitivity of a test?
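To give a flavor of what such checks look like, here is a minimal sketch. The helper names and tolerances are hypothetical, not our actual coding protocol; the tolerance should match the precision at which a result was reported.

```python
# Minimal sketch of two consistency checks; function names and
# tolerances are hypothetical, not the project's coding protocol.

def percentage_matches(numerator: int, denominator: int,
                       reported_pct: float, tol: float = 0.5) -> bool:
    """Does a reported percentage match its accompanying fraction?

    tol should match the reporting precision (0.5 for whole percentages).
    """
    recomputed = 100 * numerator / denominator
    return abs(recomputed - reported_pct) <= tol

def sensitivity_matches(tp: int, fn: int,
                        reported_sens: float, tol: float = 0.005) -> bool:
    """Does the reported sensitivity match TP / (TP + FN)?"""
    recomputed = tp / (tp + fn)
    return abs(recomputed - reported_sens) <= tol

print(percentage_matches(45, 90, 52.0))   # False: 45/90 is 50%, not 52%
print(sensitivity_matches(80, 20, 0.80))  # True: 80 / (80 + 20) = 0.80
```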

We have 3 main objectives:

  1. Post short reports with detected statistical inconsistencies underneath the preprint
  2. Assess the prevalence of statistical inconsistencies in COVID-19 preprints
  3. Compare the inconsistency rate in COVID-19 preprints with the inconsistency rate in similar preprints on other topics

We hypothesize that high time pressure may have led to a higher prevalence of statistical inconsistencies in COVID-19 preprints than in preprints on less time-sensitive topics.
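The comparison in objective 3 ultimately comes down to testing for a difference between two proportions. A self-contained sketch with made-up counts (a two-proportion z-test is one reasonable analysis choice, not necessarily the one we will use):

```python
from math import sqrt, erfc

def two_proportion_ztest(x1: int, n1: int, x2: int, n2: int):
    """Two-sided z-test for the difference between two proportions."""
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)
    se = sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    p_value = erfc(abs(z) / sqrt(2))  # two-sided p from the normal CDF
    return z, p_value

# Made-up counts: 60 of 200 COVID-19 preprints vs. 40 of 200 comparison
# preprints contain at least one statistical inconsistency.
z, p = two_proportion_ztest(60, 200, 40, 200)
print(f"z = {z:.2f}, p = {p:.3f}")  # z = 2.31, p = 0.021
```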

We thank our colleagues at the Meta-Research Center for their feedback and help in developing the coding protocol.

See the full proposal here.

New Paper: Reproducibility of Individual Effect Sizes in Psychological Meta-Analyses

I am happy to announce that our paper “Reproducibility of individual effect sizes in meta-analyses in psychology” was published in PLOS ONE (first-authored by Esther Maassen). In this study, we assessed 500 primary effect sizes from 33 psychology meta-analyses. Reproducibility was problematic in 45% of the cases (see the figure below for the different causes). We strongly recommend that meta-analysts share their data and code.
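To illustrate what “reproducing an effect size” means here, consider recomputing a standardized mean difference from the statistics reported in a primary study and comparing it with the value in the meta-analysis. This is a simplified sketch with invented numbers; the paper covers many effect size types and checks.

```python
from math import sqrt

def cohens_d(m1: float, sd1: float, n1: int,
             m2: float, sd2: float, n2: int) -> float:
    """Standardized mean difference based on the pooled standard deviation."""
    pooled_sd = sqrt(((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2) / (n1 + n2 - 2))
    return (m1 - m2) / pooled_sd

reported_d = 0.35  # invented value from a hypothetical meta-analysis table
recomputed = cohens_d(5.2, 1.1, 50, 4.8, 1.0, 50)  # invented primary-study stats
print(f"recomputed d = {recomputed:.2f}")  # 0.38: does not match the reported 0.35
```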

[Figure: causes of non-reproducible effect sizes]

Top Downloaded Paper

I am very happy to announce that my paper “Practical tools and strategies for researchers to increase replicability” was listed as a top downloaded paper for the journal Developmental Medicine & Child Neurology.

The paper gives an overview of concrete actions researchers can take to improve the openness, replicability, and overall robustness of their work.

I hope that the high number of downloads indicates that many researchers were able to cherry-pick the open practices that work for their situation.

Read the full paper (open access) here.

[Image: Wiley top downloaded paper certificate]

METAxDATA Meeting at QUEST, Berlin

Last month, the QUEST Center in Berlin organized the first METAxDATA meeting on building automated screening tools for data-driven meta-research. On the first night of the meeting, 13 researchers gave lightning talks about their tools. The clip below features my lightning talk (under two minutes) about statcheck.
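For readers unfamiliar with statcheck: it extracts APA-formatted test results from a paper and recomputes the p-value from the reported test statistic and degrees of freedom. A stripped-down illustration of that recomputation step for a t-test (skipping the text extraction, and using a simple tolerance instead of statcheck's precision-aware rounding rules):

```python
from scipy import stats

def check_t_result(t: float, df: int, reported_p: float,
                   tol: float = 0.005) -> bool:
    """Recompute the two-tailed p-value from t and df and compare it
    with the reported p (simplified: a fixed tolerance rather than
    statcheck's rounding-aware comparison)."""
    recomputed_p = 2 * stats.t.sf(abs(t), df)
    return abs(recomputed_p - reported_p) <= tol

# "t(28) = 2.20, p = .04": recomputed p is about .036, so consistent
print(check_t_result(2.20, 28, 0.04))  # True
```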

All lightning talks were recorded and can be found here.

Open Software for Open Science

At the Solid Science Workshop in Bordeaux (September 6-7, 2018), I gave a workshop on free software that facilitates solid research practices. During the workshop, we collaboratively compiled a list of resources, software, and tools that can improve different stages of the research process.

Check out the list, share it with colleagues, or add your own resources to it here: https://bit.ly/opensciencesoftware.

The slides of the workshop can be found here: https://osf.io/s8wpz/.

[Image: the empirical cycle]

In Press: Practical Tools and Strategies for Researchers to Increase Replicability

I wrote an invited review for Developmental Medicine & Child Neurology about “Practical tools and strategies for researchers to increase replicability”.

Problems with replicability have been widely discussed in recent years, especially in psychology. By now, many promising solutions have been proposed, but my sense is that researchers are sometimes overwhelmed by all the possibilities.

My goal in this review was to list some of the current recommendations that can be easily implemented. Not every solution is feasible for every project, so my advice is: copy best practices from other fields, see what works on a case-by-case basis, and improve your research step by step.

The preprint can be found here: https://psyarxiv.com/emyux.

Dr. Nuijten

On Wednesday, May 30, 2018, I successfully defended my PhD thesis, which means that I can now finally call myself Dr. Nuijten!


I thank my promotors Jelte Wicherts and Marcel van Assen for all their advice over the last 5 years, and my committee – Chris Chambers, Eric-Jan Wagenmakers, Rolf Zwaan, and Marjan Bakker – for their interesting (and fun!) questions.

My full thesis “Research on research: A meta-scientific study of problems and solutions in psychological science” can be found here.
