I am happy to announce that our paper “Reproducibility of individual effect sizes in meta-analyses in psychology” was published in PLOS ONE (first-authored by Esther Maassen). In this study, we assessed 500 primary effect sizes from 33 psychology meta-analyses. Reproducibility was problematic in 45% of the cases (see the Figure below for the different causes). We strongly recommend that meta-analysts share their data and code.
I am very happy to announce that my paper “Practical tools and strategies for researchers to increase replicability” was listed as a Top Download for the journal Developmental Medicine & Child Neurology.
The paper gives an overview of concrete actions researchers can take to improve the openness, replicability, and overall robustness of their work.
I hope the high number of downloads indicates that many researchers were able to cherry-pick open practices that worked for their situation.
Read the full paper (open access) here.
On February 21, 2020, I gave an online talk for the Webcast Series on Transparency from Project TIER on how to efficiently assess and improve the robustness of scientific findings in four steps. The full talk can be found below. More details are posted on the Project TIER website. Also check out the other talks in this series here.
Last month, the QUEST Center in Berlin organized the first METAxDATA meeting on building automated screening tools for data-driven meta-research. On the first night of the meeting, 13 researchers gave lightning talks about their tools. The clip below features my lightning talk about statcheck, in under two minutes.
All lightning talks were recorded and can be found here.
At the Solid Science Workshop in Bordeaux (September 6-7, 2018), I gave a workshop about free software to facilitate solid research practices. During this workshop, we collaboratively worked on a list of resources/software/tools that can be used to improve different stages of the research process.
Check out the list, share it with colleagues, or add your own resources to it here: https://bit.ly/opensciencesoftware.
The slides of the workshop can be found here: https://osf.io/s8wpz/.
I wrote an invited review for Developmental Medicine & Child Neurology about “Practical tools and strategies for researchers to increase replicability”.
Problems with replicability have been widely discussed in recent years, especially in psychology. By now, many promising solutions have been proposed, but my sense is that researchers are sometimes a bit overwhelmed by all the possibilities.
My goal in this review was to list some of the current recommendations that can be easily implemented. Not every solution is feasible for every project, so my advice is: copy best practices from other fields, see what works on a case-by-case basis, and improve your research step by step.
The preprint can be found here: https://psyarxiv.com/emyux.
On Wednesday, May 30, 2018, I successfully defended my PhD thesis, which means that I can now finally call myself Dr. Nuijten!
I thank my promotors Jelte Wicherts and Marcel van Assen for all their advice over the last 5 years, and my committee – Chris Chambers, Eric-Jan Wagenmakers, Rolf Zwaan, and Marjan Bakker – for their interesting (and fun!) questions.
My full thesis “Research on research: A meta-scientific study of problems and solutions in psychological science” can be found here.
In a new paper, we ran statcheck on a bunch of experimental philosophy papers. Inconsistency rates are lower than in psychology, and evidential value seems high. Good news for the philosophers! See the full paper here.
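For anyone curious to try something similar on their own set of papers: the sketch below shows how the statcheck R package can be run over a folder of PDFs. This is a minimal illustration, not the analysis script from the paper, and the folder path is a made-up placeholder.

```r
# Minimal sketch: run statcheck over a folder of PDFs.
# This is an illustration, not the paper's actual analysis script;
# the directory path below is a hypothetical placeholder.
# install.packages("statcheck")

library(statcheck)

# checkPDFdir() extracts APA-style test results (t, F, r, chi-square, Z)
# from every PDF in the directory and recomputes each p-value from the
# reported test statistic and degrees of freedom.
results <- checkPDFdir("~/papers/experimental-philosophy")

# Each row is one extracted statistic, flagged if the reported p-value
# is inconsistent with the recomputed one.
head(results)
```

Checking the extraction output by hand for a few papers is a good idea, since automated text extraction from PDFs can miss or mangle statistics.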