METAxDATA Meeting at QUEST, Berlin

Last month, the QUEST center in Berlin organized the first METAxDATA meeting on building automated screening tools for data-driven meta-research. On the first night of the meeting, 13 researchers gave lightning talks about their tools. The clip below features my lightning talk (under two minutes) about statcheck.
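For context: statcheck automatically extracts reported statistical test results from papers and recomputes the p-values to flag inconsistencies. The snippet below is only a minimal sketch of that kind of consistency check for a single reported t-test, written in Python under my own simplifying assumptions; statcheck itself is an R package that also handles extraction from full texts, rounding rules, and several test types.

```python
# Minimal sketch (not statcheck's implementation): recompute the two-sided
# p-value implied by a reported t statistic and check whether it rounds to
# the reported p-value.
from scipy import stats

def check_t_result(t, df, reported_p, decimals=2):
    """Recompute p for a reported t(df) and compare with the reported p."""
    recomputed = 2 * stats.t.sf(abs(t), df)  # two-sided p from the t distribution
    consistent = round(recomputed, decimals) == round(reported_p, decimals)
    return recomputed, consistent

# Example: a paper reports "t(28) = 2.20, p = .04".
# The recomputed p is about .036, which rounds to .04, so the report is consistent.
print(check_t_result(2.20, 28, 0.04))
```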

All lightning talks were recorded and can be found here.

Open Software for Open Science

At the Solid Science Workshop in Bordeaux (September 6-7, 2018), I gave a workshop about free software that facilitates solid research practices. During the workshop, we collaboratively compiled a list of resources, software, and tools that can be used to improve different stages of the research process.

Check out the list, share it with colleagues, or add your own resources to it here: https://bit.ly/opensciencesoftware.

The slides of the workshop can be found here: https://osf.io/s8wpz/.


In Press: Practical Tools and Strategies for Researchers to Increase Replicability

I wrote an invited review for Developmental Medicine & Child Neurology about “Practical tools and strategies for researchers to increase replicability”.

Problems with replicability have been widely discussed in recent years, especially in psychology. By now, a lot of promising solutions have been proposed, but my sense is that researchers are sometimes a bit overwhelmed by all the possibilities.

My goal in this review was to list some of the current recommendations that can be easily implemented. Not every solution is feasible for every project, so my advice is: copy best practices from other fields, see what works on a case-by-case basis, and improve your research step by step.

The preprint can be found here: https://psyarxiv.com/emyux.

Dr. Nuijten

On Wednesday, May 30, 2018, I successfully defended my PhD thesis, which means that I can now finally call myself Dr. Nuijten!


I thank my promotors Jelte Wicherts and Marcel van Assen for all their advice over the last 5 years, and my committee – Chris Chambers, Eric-Jan Wagenmakers, Rolf Zwaan, and Marjan Bakker – for their interesting (and fun!) questions.

My full thesis “Research on research: A meta-scientific study of problems and solutions in psychological science” can be found here.


New Preprint: Effect Sizes, Power, and Biases in Intelligence Research

Our new meta-meta-analysis on intelligence research is now online as a preprint at https://psyarxiv.com/ytsvw.

We analyzed 131 meta-analyses in intelligence research to investigate effect sizes, power, and patterns of bias. We find a typical effect of r = .26 and a median sample size of 60.
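As a rough back-of-the-envelope illustration (not the analysis reported in the preprint), the sketch below approximates the power of a two-sided test to detect a correlation of r = .26 with n = 60, using the Fisher z approximation.

```python
# Rough illustration only: approximate power to detect the "typical" effect
# of r = .26 with the median sample size of n = 60, two-sided alpha = .05,
# using the Fisher z approximation for a test of H0: rho = 0.
import numpy as np
from scipy import stats

def correlation_power(r, n, alpha=0.05):
    """Approximate power of a two-sided test that the correlation is zero."""
    z_r = np.arctanh(r)               # Fisher z transform of the assumed effect
    se = 1 / np.sqrt(n - 3)           # approximate standard error of z
    z_crit = stats.norm.ppf(1 - alpha / 2)
    shift = z_r / se                  # expected test statistic under the alternative
    return stats.norm.sf(z_crit - shift) + stats.norm.cdf(-z_crit - shift)

print(round(correlation_power(0.26, 60), 2))  # about 0.5
```

Under these simplifying assumptions, power comes out at roughly 50%.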

The median power seems low (see figure below), and we find evidence for small-study effects, possibly indicating overestimated effects. We don't find evidence for a US effect, a decline or early-extremes effect, or citation bias.

[Figure: median power per meta-analysis type and overall (random-effects estimates)]

Comments are very welcome and can be posted on the PubPeer page https://pubpeer.com/publications/9F209A983618EFF9EBED07FDC7A7AC.