Research Transparency

In the last decade, there has been prodigious growth in the demand for evidence to inform policy design. This, in turn, has driven investment in data-intensive social science research. Yet the incentives, norms, and institutions that govern economics, political science, and related disciplines do not always promote openness and integrity. For example, novel and positive results are published more frequently than replications, null results, or perplexing outcomes, leaving a biased and incomplete record of research. This is especially problematic when research evidence informs high-level decision-making, since government policies can affect millions of people over many years.

Our Approach

To promote open, reproducible research practices within the scientific community, CEGA is working across disciplines to identify the most useful strategies and tools for data collection, analysis, and reporting. This includes the use of study registries, pre-analysis plans, disclosure standards, version control and data sharing platforms, and replication projects.

Berkeley Initiative for Transparency in the Social Sciences

The Berkeley Initiative for Transparency in the Social Sciences (BITSS) is a growing, interdisciplinary network of researchers and institutions committed to improving scientific integrity. BITSS is training the next generation of researchers to adopt emerging software and methods; the initiative also sponsors annual workshops to build consensus and standards across the social sciences. The end goal is to strengthen the scientific evidence on which social and economic policies are based.

Research Highlights

Transparency in Practice: Community-Driven Development in Sierra Leone

At the inception of a randomized trial in Sierra Leone (testing the effectiveness of a post-war village development program), CEGA and J-PAL researchers agreed on the set of hypotheses they would test as part of their experiment. Before reviewing their field data, they published a detailed document outlining their planned analyses. This “pre-analysis plan” prevented the research team from selectively reporting only the most positive results out of the 318 outcome variables collected.
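To see why pre-specification matters with so many outcomes, consider a minimal simulation sketch. It is not the study's actual analysis: the 318 outcome count comes from the text above, but the sample size and the assumption of no true treatment effect are illustrative.

```python
import numpy as np

# Minimal sketch: with 318 outcomes and NO true treatment effect,
# roughly 5% of tests will still clear p < 0.05 by chance alone.
rng = np.random.default_rng(0)

n_outcomes = 318   # outcome count reported for the Sierra Leone trial
n_per_arm = 200    # hypothetical sample size per treatment arm

false_positives = 0
for _ in range(n_outcomes):
    treated = rng.normal(0.0, 1.0, n_per_arm)   # no real effect
    control = rng.normal(0.0, 1.0, n_per_arm)
    # Two-sample t statistic (equal variances assumed for simplicity)
    diff = treated.mean() - control.mean()
    se = np.sqrt(treated.var(ddof=1) / n_per_arm
                 + control.var(ddof=1) / n_per_arm)
    if abs(diff / se) > 1.96:  # normal approximation to the t threshold
        false_positives += 1

print(f"'Significant' outcomes by chance alone: {false_positives} of {n_outcomes}")
```

In expectation, about 318 × 0.05 ≈ 16 outcomes look “significant” purely by chance, which is why a pre-analysis plan commits researchers to reporting all pre-specified tests rather than only those that cleared the threshold.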

Publication Bias and P-hacking

Because scientists tend to report only the studies or analyses that “worked,” readers must ask, “Are the effects of this intervention real, or have the authors selectively reported the positive findings?” The p-curve is a tool to help answer this question. Developed by BITSS affiliates, it is a statistical technique that examines the distribution of statistically significant p-values to detect whether analyses have been manipulated (“p-hacked”) to reach statistical significance.
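The core intuition can be shown with a short simulation. The sketch below is illustrative, not the authors' implementation: study counts, sample sizes, and the effect size d = 0.5 are assumed for demonstration.

```python
import numpy as np
from scipy import stats

# Sketch of the p-curve idea: keep only significant results (p < .05)
# and examine their distribution. A real effect yields a right-skewed
# curve (many very small p-values); a null effect yields a roughly
# flat curve, and p-hacking tends to pile mass just below .05.
rng = np.random.default_rng(1)

def p_values(effect, n_studies=5000, n=50):
    """Two-sample t-test p-values for simulated studies with a given true effect."""
    ps = []
    for _ in range(n_studies):
        a = rng.normal(effect, 1.0, n)
        b = rng.normal(0.0, 1.0, n)
        ps.append(stats.ttest_ind(a, b).pvalue)
    return np.array(ps)

for label, effect in [("true effect (d = 0.5)", 0.5), ("null effect", 0.0)]:
    ps = p_values(effect)
    sig = ps[ps < 0.05]  # the p-curve is built from these alone
    counts = np.histogram(sig, bins=[0, .01, .02, .03, .04, .05])[0]
    print(label, "share per bin (.00-.05):", np.round(counts / counts.sum(), 2))
```

Under a true effect, the smallest bin (p < .01) dominates; under the null, the bins are roughly uniform. Selective analysis shifts mass toward the .04-.05 bin, which is the signature the p-curve is designed to detect.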

Promoting Transparency in Social Science Research

In 2014, BITSS affiliates published a call to improve the quality and credibility of research in the social sciences. The authors advocate research practices that realign academic incentives with the scholarly values of truth and transparency. However, new norms must be implemented in a way that neither stifles creativity nor places excessive burdens on researchers.