At the Solid Science Workshop in Bordeaux (September 6-7, 2018), I gave a session on free software that facilitates solid research practices. During this session, we collaboratively compiled a list of resources, software, and tools that can be used to improve different stages of the research process.
Check out the list, share it with colleagues, or add your own resources to it here: https://bit.ly/opensciencesoftware.
The slides of the workshop can be found here: https://osf.io/s8wpz/.
I wrote an invited review for Developmental Medicine & Child Neurology about “Practical tools and strategies for researchers to increase replicability”.
Problems with replicability have been widely discussed in recent years, especially in psychology. Many promising solutions have been proposed by now, but my sense is that researchers are sometimes a bit overwhelmed by all the possibilities.
My goal in this review was to list some of the current recommendations that can be easily implemented. Not every solution is feasible for every project, so my advice is: copy best practices from other fields, see what works on a case-by-case basis, and improve your research step by step.
The preprint can be found here: https://psyarxiv.com/emyux.
Saturday June 30, I was interviewed about my dissertation for the Dutch radio show “Dr Kelder & Co”, for NPO Radio 1. The main takeaways: scientists are also just people, psychology is heading in the right direction, and trains don’t always do what you want.
Listen to the whole interview (in Dutch) here.
Wednesday May 30, 2018, I successfully defended my PhD thesis, which means that I can now finally call myself Dr. Nuijten!
I thank my promotors Jelte Wicherts and Marcel van Assen for all their advice over the last 5 years, and my committee – Chris Chambers, Eric-Jan Wagenmakers, Rolf Zwaan, and Marjan Bakker – for their interesting (and fun!) questions.
My full thesis “Research on research: A meta-scientific study of problems and solutions in psychological science” can be found here.
My dissertation is finished!
The cover: my own desk, feat. SIPS, BITSS, and COS. Cover design by Niels Bongers.
The contents: statcheck, data sharing, meta-analysis, power, bias, and more.
You can find the full thesis here:
Our new meta-meta-analysis on intelligence research is now online as a preprint at https://psyarxiv.com/ytsvw.
We analyzed 131 meta-analyses in intelligence research to investigate effect sizes, power, and patterns of bias. We find a typical effect of r = .26 and a median sample size of 60.
The median power seems low (see figure below), and we find evidence for small study effects, possibly indicating overestimated effects. We don’t find evidence for a US effect, decline or early-extremes effect, or citation bias.
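To illustrate why power at these numbers tends to be low, here is a rough back-of-the-envelope calculation for the typical values reported above (r = .26, n = 60), using the Fisher z approximation for a two-sided test of a correlation. This is a simplified sketch for intuition only, not the method used in the paper:

```python
from statistics import NormalDist
import math

def power_correlation(r, n, alpha=0.05):
    """Approximate two-sided power for testing H0: rho = 0,
    assuming a true correlation r, via the Fisher z transformation."""
    nd = NormalDist()
    z = math.atanh(r)                      # Fisher z of the assumed true effect
    se = 1 / math.sqrt(n - 3)              # standard error of z
    z_crit = nd.inv_cdf(1 - alpha / 2)     # two-sided critical value
    ncp = z / se                           # expected standardized effect
    return nd.cdf(ncp - z_crit) + nd.cdf(-ncp - z_crit)

print(round(power_correlation(0.26, 60), 2))  # ≈ 0.52
```

So even for the typical effect in this literature, a median-sized study has only about a coin-flip chance of detecting it at α = .05, which is far below the conventional 80% target.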
Comments are very welcome and can be posted on the PubPeer page https://pubpeer.com/publications/9F209A983618EFF9EBED07FDC7A7AC.
“The right to read is the right to mine”. That was the motto of yesterday’s meeting at the European Commission, where we discussed how new European copyright laws would affect text and data mining (TDM) research.
The new proposal would seriously impede the use of TDM by businesses: effectively, they would not have the right to mine content to which they already have legal access, which is of course very strange.
The proposal does include an exemption for non-commercial research organizations – which includes universities, and with that, my work – but this is still not sufficient. For one, it would prevent scientists from commercializing any breakthroughs based on TDM research. On top of that, an increasing number of scientists seek collaboration with businesses (for example, to increase the chances of getting a Horizon 2020 grant).
For updates on this legislation, and more info on the TDM restrictions, see the website, including an open letter, of the European Alliance for Research Excellence (EARE).
In the latest Science Insider piece, written by Dalmeet Singh Chawla, I argue that statcheck does exactly what it’s supposed to do: check the internal consistency of APA-reported NHST results.
Read the entire piece here.
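The core of such a consistency check can be sketched in a few lines: recompute the p-value from the reported test statistic and degrees of freedom, and flag the result if the recomputed p does not round to the reported p. The sketch below is a simplified stand-in in Python (statcheck itself is an R package), applied to a hypothetical APA-style result; the Simpson-rule integration of the t density stands in for the t-distribution routines a real implementation would use:

```python
import math

def t_pdf(x, df):
    """Density of Student's t distribution with df degrees of freedom."""
    c = math.gamma((df + 1) / 2) / (math.sqrt(df * math.pi) * math.gamma(df / 2))
    return c * (1 + x * x / df) ** (-(df + 1) / 2)

def two_tailed_p(t, df, steps=20000):
    """Two-tailed p-value: one minus the central area under the t density,
    integrated over [-|t|, |t|] with Simpson's rule (steps must be even)."""
    a, b = -abs(t), abs(t)
    h = (b - a) / steps
    area = t_pdf(a, df) + t_pdf(b, df)
    for i in range(1, steps):
        area += (4 if i % 2 else 2) * t_pdf(a + i * h, df)
    return 1 - area * h / 3

def consistent(t, df, reported_p, decimals=3):
    """A reported p is consistent if the recomputed p rounds to it."""
    return round(two_tailed_p(t, df), decimals) == reported_p

# Hypothetical reported result: t(28) = 2.20, p = .036
print(consistent(2.20, 28, 0.036))   # consistent: recomputed p rounds to .036
print(consistent(2.20, 28, 0.020))   # False: would be flagged as inconsistent
```

A full checker additionally has to parse results out of article text and handle other test types (F, χ², r, z) and one-tailed tests, but the recompute-and-compare step above is the heart of it.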
Nature published a series of comments all focused on ways to fix our statistics. In my comment, I argue that the main problem is the flexibility in data analysis, combined with incentives to find significant results. A possible solution would be to preregister analysis plans, and to share data.
Read the entire piece here, for the following set of solutions:
- Jeff Leek: Adjust for human cognition
- Blake McShane, Andrew Gelman, David Gal, Christian Robert, and Jennifer Tackett: Abandon statistical significance
- David Colquhoun: State false-positive risk, too
- Michèle Nuijten: Share analysis plans and results
- Steven Goodman: Change norms from within
Illustration by David Parkins