In a new paper, we ran statcheck on a sample of experimental philosophy papers. Inconsistency rates are lower than in psychology, and the evidential value seems high. Good news for the philosophers! See the full paper here.
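For readers who want to try something similar on their own set of papers, here is a minimal sketch in R (the folder path is just a placeholder, and statcheck's exact output columns can differ between versions):

# install.packages("statcheck")  # the statcheck package is on CRAN
library(statcheck)

# Extract APA-style test results (t, F, r, chi2, Z) from all PDFs in a
# folder and recompute each p value from the reported test statistic.
# "~/xphi_papers" is a placeholder path, not the actual corpus.
results <- checkPDFdir("~/xphi_papers")

# One row per detected test; the output flags results whose reported and
# recomputed p values do not match.
head(results)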
New Preprint: Effect Sizes, Power, and Biases in Intelligence Research
Our new meta-meta-analysis on intelligence research is now online as a preprint at https://psyarxiv.com/ytsvw.
We analyzed 131 meta-analyses in intelligence research to investigate effect sizes, power, and patterns of bias. We find a typical effect of r = .26 and a median sample size of 60.
The median power seems low (see the figure below), and we find evidence for small-study effects, possibly indicating overestimated effects. We find no evidence for a US effect, a decline or early-extremes effect, or citation bias.
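As a back-of-the-envelope illustration (not part of the paper itself), plugging the typical effect of r = .26 and the median sample size of 60 into a standard power calculation with the pwr package shows why the median power comes out low:

library(pwr)  # install.packages("pwr") if needed

# Power of a two-sided correlation test at alpha = .05 for the typical
# effect (r = .26) and median sample size (n = 60) reported above.
pwr.r.test(n = 60, r = 0.26, sig.level = 0.05)
# The implied power is only around .5, well below the conventional .8 benchmark.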
Comments are very welcome and can be posted on the PubPeer page https://pubpeer.com/publications/9F209A983618EFF9EBED07FDC7A7AC.
Paper Accepted @ Collabra: Psychology
Our paper “Journal data sharing policies and statistical reporting inconsistencies in psychology” has been accepted for publication in the open access journal Collabra: Psychology!
The updated (accepted) pre-print can be found on PsyArXiv: https://psyarxiv.com/sgbta.
LSE Impact Blog on Data Sharing
New Preprint: Data Sharing & Statistical Inconsistencies
We just published the preprint of our new study “Journal Data Sharing Policies and Statistical Reporting Inconsistencies in Psychology” at https://osf.io/preprints/psyarxiv/sgbta.
In this paper, we ran three independent studies to investigate whether data sharing is related to fewer statistical inconsistencies in a paper. Overall, we found no relationship between data sharing and reporting inconsistencies. However, we did find that journal policies on data sharing are extremely effective in promoting data sharing (see the figure below).
We argue that open data is essential in improving the quality of psychological science, and we discuss ways to detect and reduce reporting inconsistencies in the literature.
Awarded a Campbell Methods Grant
I am honored to announce that Joshua R. Polanin and I were awarded a $20,000 methods grant from the Campbell Collaboration for the project “Verifying the Accuracy of Statistical Significance Testing in Campbell Collaboration Systematic Reviews Through the Use of the R Package statcheck”.
The grant is part of the Campbell Collaboration’s program to support innovative methods development and improve the quality of systematic reviews. It is great that we (and statcheck!) can be a part of this effort.
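To give a flavour of what this kind of verification looks like, here is a minimal statcheck example on a single, made-up APA-style result (the numbers are invented purely for illustration):

library(statcheck)

# A fabricated sentence reporting a t test in APA style.
txt <- "The effect was significant, t(28) = 2.20, p < .01."

# statcheck extracts the test statistic and degrees of freedom and
# recomputes the p value (here about .036), so the reported "p < .01"
# would be flagged as an inconsistency.
statcheck(txt)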
For more information about the grant and the three other recipients, see their website here.
The Meta-Research Center in The Guardian
Today’s long read in the Guardian by Stephen Buranyi features our work at the Meta-Research Center. Specifically, it focuses on the work of Chris Hartgerink and Marcel van Assen on detecting fabricated data, and on the role the development and use of statcheck played in their research.
Read the full article here.
Join our seminar “Improving Scientific Practice: Dealing with the Human Factors”, September 11, 2014, Amsterdam.
This seminar will take a positive approach and focus on practical solutions for dealing with human factors in the scientific enterprise. Renowned scientists from various research areas, known for their active involvement in and contributions to improving scientific practice, will share their expertise and offer feasible ways to advance how we do science.
Keynote addresses: John Ioannidis and Melissa Anderson
For more information and registration: www.human-factors-in-science.com