Recently, academic organisations in the Netherlands have been discussing how we can improve the system of Recognition and Rewards for scientists. In a short interview for Tilburg University, I explain my hope that rewarding Open Science can benefit both science and scientists.
In a comment in Nature that came out today, we address the potential problem of conflicts of interest in psychology awards. We reviewed the websites of 58 psychological societies and found that the large majority did not mention any conflict of interest policy with respect to awards. This means we cannot rule out that award recipients are closely affiliated with the award committee (e.g., a supervisor selecting their own PhD student for an award). We urge societies to be open about their award procedures, to avoid the impression of hidden nepotism.
We thank everyone who helped us code the websites at the 2019 SIPS meeting.
Read the Nature comment here.
Read the working paper here.
Yesterday, I was part of the discussion panel at the “KennisCafé: De Foute Avond”. The KennisCafé is a monthly night at de Balie, Amsterdam, where scientists and other experts discuss a topic for a lay audience. This night, the theme was problems and mistakes in science.
Together with panel members Lotty Hooft (director of Cochrane Netherlands), Lex Bouter (professor of scientific integrity and methodology), and Paul Iske (“Chief Failure Officer”), I discussed topics such as statistical mistakes, problems with replication, and possible directions for solutions.
The livestream (in Dutch) can be found here:
The KennisCafé is a production of NEMO Science Museum, KNAW, de Volkskrant, and de Balie. More information can be found here.
In a recent letter to the editor in the Chronicle, we respond to an earlier article that presented the open science movement as “burning things to the ground”. We disagree: we mainly see cooperative, constructive, and pragmatic initiatives to improve the state of psychological science.
Read the full letter here.
Chartier, C. R., Kline, M. E., McCarthy, R. J., Nuijten, M. B., Dunleavy, D. J., & Ledgerwood, A. A cooperative revolution in psychology. The Chronicle of Higher Education.
Saturday June 30, I was interviewed about my dissertation for the Dutch radio show “Dr Kelder & Co” on NPO Radio 1. The main takeaways: scientists are also just people, psychology is heading in the right direction, and trains don’t always do what you want.
Listen to the whole interview (in Dutch) here.
In the latest Science Insider piece, written by Dalmeet Singh Chawla, I argue that statcheck does exactly what it’s supposed to do: check the consistency of APA-reported NHST results.
Read the entire piece here.
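For context, the core check statcheck automates is simple: recompute the p-value from the reported test statistic and compare it with the reported p-value, allowing for rounding. statcheck itself is an R package covering t, F, r, χ², and z tests; as a rough illustration only, here is a minimal Python sketch for the z-test case (the function names, regex, and rounding tolerance are my own assumptions, not statcheck’s implementation).

```python
import math
import re

def two_tailed_p_from_z(z):
    """Two-tailed p-value for a z statistic under the standard normal:
    P(|Z| > |z|) = erfc(|z| / sqrt(2))."""
    return math.erfc(abs(z) / math.sqrt(2))

def check_z_report(report):
    """Check an APA-style 'z = ..., p = ...' report for internal consistency.

    Simplification: only exact p-values ('p = ...') are handled here,
    not inequalities like 'p < .05'. Returns (recomputed_p, consistent).
    """
    m = re.match(r"z\s*=\s*(-?[\d.]+),\s*p\s*=\s*([\d.]+)", report)
    if not m:
        raise ValueError("not an APA-style z report")
    z = float(m.group(1))
    p_reported = float(m.group(2))
    p_recomputed = two_tailed_p_from_z(z)
    # Reported p-values are rounded, so allow half a unit at the
    # reported number of decimal places.
    decimals = len(m.group(2).split(".")[1])
    consistent = abs(p_recomputed - p_reported) <= 0.5 * 10 ** -decimals
    return p_recomputed, consistent
```

For example, `check_z_report("z = 1.96, p = .05")` flags the report as consistent, while `check_z_report("z = 1.96, p = .01")` flags it as inconsistent, since the recomputed two-tailed p for z = 1.96 is about .05.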
Nature published a series of comments focused on ways to fix our statistics. In my comment, I argue that the main problem is flexibility in data analysis combined with the incentive to find significant results. A possible solution would be to preregister analysis plans and to share data.
Read the entire piece here, for the following set of solutions:
- Jeff Leek: Adjust for human cognition
- Blake McShane, Andrew Gelman, David Gal, Christian Robert, and Jennifer Tackett: Abandon statistical significance
- David Colquhoun: State false-positive risk, too
- Michèle Nuijten: Share analysis plans and results
- Steven Goodman: Change norms from within
Publons announced the winner of the Sentinel Award for outstanding advocacy, innovation, or contribution to scholarly peer review, and I am proud to say that statcheck was crowned runner-up!
I am honored that the judges considered statcheck a useful contribution to the peer review system. Ultimately, one of the things I hope to achieve is that all psychology journals will make it standard practice to quickly “statcheck” a paper for statistical inconsistencies before publication, to avoid publishing inconsistent results.
Warm congratulations to the winner of the award, Irene Hames, who has spent much of her career improving the quality of peer review; it is great that her work is recognized in this way! Congratulations also to the rest of the Sentinel Award nominees: Retraction Watch, American Geophysical Union, ORCiD, F1000Research, The Committee on Publication Ethics (COPE), Kyle Martin, and Gareth Fraser.
For more information about the award, the winner, and the finalists, see this page.
This week the Guardian’s Science Weekly podcast focuses on statistical malpractice and fraud in science. We talk about the role of statcheck in detecting statistical inconsistencies, and discuss the causes and implications of seemingly innocent rounding errors.
The podcast also features fascinating insights from consultant anaesthetist John Carlisle on detecting data fabrication, and from Royal Statistical Society president David Spiegelhalter on the dangers of statistical malpractice.