Home

I am an Assistant Professor at Tilburg University. My research focuses on meta-science, including topics such as replication, publication bias, statistical errors, and questionable research practices.

I am part of the Meta-Research Center at Tilburg University: http://metaresearch.nl.

Contact
Email: m.b.nuijten@uvt.nl
Work phone: (+31) (0) 13 466 2053


NWO Veni Grant for the 4-Step Robustness Check

I'm thrilled to announce that I won a €250,000 NWO Veni Grant for my 4-Step Robustness Check! For the next three years, I'll be working on methods to assess and improve the robustness of psychological science.

To check the robustness of a study, you could replicate it in a new sample. In my 4-Step Robustness Check, however, you first verify whether the reported numbers in the original study are correct. If they are not, they are not interpretable, and you cannot meaningfully compare them to the results of your replication.

Specifically, I advise researchers to do the following:

  1. Check whether there are visible errors in the reported numbers, for example by running the paper through statcheck, my spellchecker for statistics (see the sketch below this list)
  2. Reanalyze the data following the original strategy to see if this leads to the same numbers
  3. Check if the result is robust to alternative analytical choices
  4. Perform a replication study in a new sample
[Figure: The 4-Step Robustness Check can be used to efficiently assess the robustness of results]
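As a minimal sketch of step 1: the statcheck R package can scan reported statistics and flag numbers that are internally inconsistent. The calls below are illustrative; see the statcheck documentation for the full interface.

    # Step 1 sketch: check reported statistics for internal consistency
    # with the statcheck R package (install.packages("statcheck")).
    library(statcheck)

    # Does the reported p-value match the test statistic and degrees of freedom?
    txt <- "The effect was significant, t(28) = 2.20, p < .001."
    statcheck(txt)
    # The p-value recomputed from t(28) = 2.20 is about .036, so the reported
    # "p < .001" would be flagged as inconsistent.

    # A full article can also be screened directly from PDF (path is a placeholder):
    # checkPDF("my_paper.pdf")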

This 4-step check provides an efficient framework to assess whether a study's findings are robust. Note that the first steps take far less time than a full replication and may already be enough to conclude that a result is not robust.

The proposed framework can also serve as an efficient checklist for researchers to improve the robustness of their own results:

  1. Check the internal consistency of your reported results
  2. Share your data and analysis scripts to facilitate reanalysis
  3. Conduct and report your own sensitivity analyses (see the sketch after this list)
  4. Write detailed methods sections and share materials to facilitate replication
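To illustrate step 3 of this checklist, below is a toy sensitivity analysis in R. The data and the analytical choice (excluding flagged outliers) are hypothetical; the point is simply to re-run the same analysis under alternative, defensible choices and compare the estimates.

    # Toy sensitivity analysis: is the estimate robust to excluding outliers?
    set.seed(1)
    d <- data.frame(x = rnorm(100), flagged = rbinom(100, 1, 0.05))
    d$y <- 0.3 * d$x + rnorm(100)

    # Choice A: analyze all observations
    est_all <- coef(lm(y ~ x, data = d))["x"]

    # Choice B: exclude observations flagged as outliers
    est_sub <- coef(lm(y ~ x, data = subset(d, flagged == 0)))["x"]

    # If the two estimates diverge substantially, the result is sensitive to
    # this choice, and that sensitivity should be reported.
    c(all_data = est_all, outliers_excluded = est_sub)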

Ultimately, I aim to create interactive, pragmatic, and evidence-based methods to improve and assess robustness, applicable to psychology and other fields.

I would like to wholeheartedly thank my colleagues, reviewers, and committee members for their time, feedback, and valuable insights. I’m looking forward to the next three years!

Seed Funding for COVID-19 Project

I am happy to announce that Robbie van Aert, Jelte Wicherts, and I received seed funding from the Herbert Simon Research Institute for our project to screen COVID-19 preprints for statistical inconsistencies.

Inconsistencies can distort conclusions, and even small inconsistencies harm the reproducibility of a paper (i.e., it becomes unclear where a number came from). Statistical reproducibility is a basic requirement for any scientific paper.

We plan to check a random sample of COVID-19 preprints from medRxiv and bioRxiv for several types of statistical inconsistencies. For example, does a percentage match the accompanying fraction? Do the TP/TN/FP/FN rates match the reported sensitivity of a test? (See the sketch below.)
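As a hypothetical illustration (made-up numbers, not our actual coding protocol), such checks boil down to simple recomputations, for example in R:

    # Does a reported percentage match the accompanying fraction (47/200)?
    reported_pct   <- 23.5
    recomputed_pct <- 100 * 47 / 200              # 23.5
    abs(reported_pct - recomputed_pct) < 0.05     # TRUE -> consistent

    # Does a reported sensitivity match the reported TP and FN counts?
    TP <- 90; FN <- 30
    reported_sens   <- 0.80
    recomputed_sens <- TP / (TP + FN)             # 0.75
    abs(reported_sens - recomputed_sens) < 0.005  # FALSE -> flag for review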

We have 3 main objectives:

  1. Post short reports with detected statistical inconsistencies underneath the preprint
  2. Assess the prevalence of statistical inconsistencies in COVID-19 preprints
  3. Compare the inconsistency rate in COVID-19 preprints with that in similar preprints on other topics

We hypothesize that high time pressure may have led to a higher prevalence of statistical inconsistencies in COVID-19 preprints than in preprints on less time-sensitive topics.

We thank our colleagues at the Meta-Research Center for their feedback and help in developing the coding protocol.

See the full proposal here.

New Paper: Reproducibility of Individual Effect Sizes in Psychological Meta-Analyses

I am happy to announce that our paper "Reproducibility of individual effect sizes in meta-analyses in psychology" was published in PLOS ONE (first-authored by Esther Maassen). In this study, we assessed 500 primary effect sizes from 33 psychology meta-analyses. Reproducibility was problematic in 45% of the cases (see the figure below for the different causes). We strongly recommend that meta-analysts share their data and code.

[Figure: causes of non-reproducible effect sizes in the sampled meta-analyses]

Top Downloaded Paper

I am very happy to announce that my paper "Practical tools and strategies for researchers to increase replicability" was listed as a Top Download for the journal Developmental Medicine & Child Neurology.

The paper provides an overview of concrete actions researchers can take to improve the openness, replicability, and overall robustness of their work.

I hope the high number of downloads indicates that many researchers were able to cherry-pick open practices that worked for their situation.

Read the full paper (open access) here.


METAxDATA Meeting at QUEST, Berlin

Last month, the QUEST Center in Berlin organized the first METAxDATA meeting on building automated screening tools for data-driven meta-research. On the first night of the meeting, 13 researchers gave lightning talks about their tools. The clip below features my lightning talk about statcheck (under two minutes).

All lightning talks were recorded and can be found here.

Nature Comment: Rule Out Conflicts of Interest in Psychology Awards


In a comment in Nature that came out today, we address the potential problem of conflicts of interest in psychology awards. We reviewed the websites of 58 psychological societies and found that the large majority did not mention any conflict-of-interest policy for their awards. This means we cannot rule out that award recipients are closely affiliated with the award committee (e.g., a supervisor selecting his or her PhD student for an award). We urge societies to be open about their award procedures to avoid the impression of hidden nepotism.

We thank everyone who helped us code the websites at the 2019 SIPS meeting.

Read the Nature comment here.

Read the working paper here.

 

Guest at “KennisCafé: De Foute Avond”

Yesterday, I was part of the discussion panel at the "KennisCafé: De Foute Avond". The KennisCafé is a monthly evening at de Balie, Amsterdam, where scientists and other experts discuss a topic for a lay audience. This night's theme was problems and mistakes in science.

Together with panel members Lotty Hooft (director of Cochrane Netherlands), Lex Bouter (professor of scientific integrity and methodology), and Paul Iske ("Chief Failure Officer"), I discussed topics such as statistical mistakes, problems with replication, and possible directions for solutions.

The livestream (in Dutch) can be found here.

The KennisCafé is a production of NEMO Science Museum, KNAW, de Volkskrant, and de Balie. More information can be found here.

Teacher of the Year

I am proud and happy to announce that I was elected Teacher of the Year at Tilburg University.

In teaching, I hold on to a famous Dutch saying: "beter goed gejat, dan slecht bedacht", or "it's better to steal something good than to come up with something bad". There are so many smart people coming up with innovative educational tips, tricks, and tools that it doesn't make sense (to me) to try to reinvent the wheel.

I’m always trying to improve my teaching and my courses, and I’m incredibly thankful that my students seem to notice that 🙂


Tilburg University Dissertation Prize

Yesterday I was awarded the Tilburg University Dissertation Prize. It is a great honor, but I'm especially grateful because I see this as a sign that Tilburg University thinks it is good to be critical of the current scientific system, and that open science is an important step forward.

I would like to thank my advisors and collaborators, without whom this dissertation would not exist.

My full dissertation, “Research on Research: A Meta-Scientific Study of Problems and Solutions in Psychological Science”, can be downloaded here.
