Meta-analyses are an important tool for evaluating the literature. It is essential that meta-analyses can be easily reproduced, both so that researchers can evaluate the impact of subjective choices on meta-analytic effect sizes, and so that meta-analyses can be updated as new data come in or as novel statistical techniques (for example, to correct for publication bias) are developed. Research in medicine has revealed that meta-analyses often cannot be reproduced.
Daniel Lakens and colleagues examined the reproducibility of meta-analyses in psychology by attempting to reproduce twenty published meta-analyses. They found that 96% of meta-analyses published in 2013-2014 did not adhere to reporting guidelines, and a third did not contain a table specifying all individual effect sizes. Five of the 20 randomly selected meta-analyses could not be reproduced at all, due to a lack of access to raw data, missing details about the effect sizes extracted from each study, or insufficient information about how effect sizes were coded. In the remaining meta-analyses, differences between the reported and reproduced effect sizes or sample sizes were common.
The authors suggest a range of possible improvements, including better disclosure of the data used to calculate each effect size, disclosure of all individual effect sizes, detailed information about the equations used (including how multiple effect size estimates from the same study were combined), and sharing of raw data retrieved from original authors or of unpublished research reports.
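To see why disclosing individual effect sizes and equations matters, consider that pooling the per-study estimates is a short, fully mechanical computation once those inputs are available. The sketch below implements one standard approach, DerSimonian-Laird random-effects pooling; it is an illustration of the kind of calculation involved, not the specific method used in the meta-analyses the authors examined, and the function name and example numbers are hypothetical.

```python
import math

def random_effects_meta(effects, variances):
    """DerSimonian-Laird random-effects meta-analysis (illustrative sketch).

    effects:   per-study effect sizes (e.g. Cohen's d)
    variances: per-study sampling variances
    Returns (pooled effect size, tau^2 between-study variance estimate).
    """
    # Fixed-effect (inverse-variance) weights and weighted mean
    w = [1.0 / v for v in variances]
    fixed = sum(wi * yi for wi, yi in zip(w, effects)) / sum(w)
    # Cochran's Q: weighted squared deviations from the fixed-effect mean
    q = sum(wi * (yi - fixed) ** 2 for wi, yi in zip(w, effects))
    df = len(effects) - 1
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    # DL estimator of between-study variance, truncated at zero
    tau2 = max(0.0, (q - df) / c)
    # Random-effects weights add tau^2 to each study's variance
    w_star = [1.0 / (v + tau2) for v in variances]
    pooled = sum(wi * yi for wi, yi in zip(w_star, effects)) / sum(w_star)
    return pooled, tau2
```

With every individual effect size and variance reported, anyone can rerun such a computation and check the published pooled estimate; without that table, the result cannot be verified or updated.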