Growing demand for rigorous evidence on policy impacts and costs has driven increased investment in data-intensive social science research. Yet the incentives, norms, and institutions that govern how research is practiced do not always promote openness, integrity, or relevance, resulting in an evidence base that is incomplete at best and misleading at worst. CEGA’s flagship transparency and reproducibility initiative, the Berkeley Initiative for Transparency in the Social Sciences (BITSS), advances practices, tools, and policies that strengthen the integrity of research, while carefully studying how researchers and policymakers respond to and value transparency and reproducibility. Similarly, through our Open Policy Analysis (OPA) work and Cost Transparency Initiative (CTI), we are developing new standards, tools, and “best practice” approaches that make the evidence reported and delivered to policymakers more transparent, relevant, and rigorously produced. With higher-quality evidence in hand, policymakers can in turn spend less time assessing credibility and more time making informed decisions. We leverage these tools and approaches to train researchers and policymakers, build communities of practice, develop wide-reaching institutional policies, and build capacity to produce and use high-quality evidence.
Established in 2012, the Berkeley Initiative for Transparency in the Social Sciences (BITSS) is a growing, interdisciplinary network of researchers and institutions committed to improving scientific integrity through the promotion of transparent and reproducible research practices. Through the research projects it funds and conducts, as well as trainings and dissemination events, BITSS works to identify best practices and useful tools for ethical and rigorous data collection, analysis, and reporting. These include study registries, pre-analysis plans, disclosure standards, version control, and data sharing...
Motivation
Cost evidence is essential for policymakers deciding how to allocate scarce resources across impactful programs. Yet fewer than one in five impact evaluations integrates cost evidence, such as a cost-effectiveness analysis (CEA) or cost-benefit analysis (CBA) of the evaluated program (Brown and Tanner, 2019). The lack of attention to costing from researchers limits the influence of impact evidence by making it less relevant to decision makers’ concerns. Moreover, a dearth of costing evidence inhibits policymakers’ ability to identify and implement the most...
CEGA is excited to join the World Bank, as part of a global consortium of institutions, to design and populate the Impact Data and Evidence Aggregation Library (IDEAL). IDEAL will allow users to easily search, compare, and visualize relative effect sizes from randomized evaluations of a wide range of interventions deployed in low- and middle-income countries. IDEAL can facilitate quantitative meta-analyses as well as more qualitative systematic reviews, whether such efforts are centered on an intervention, policy, population, or...
Demand for evidence-based policy and decision-making has grown in recent decades. But while policy-oriented research has become more rigorous and transparent, policy analysis—which contextualizes evidence to inform specific decisions, rather than generate new evidence—remains largely opaque. Open Policy Analysis (OPA) improves the credibility and reusability of policy reports by opening up their underlying data, code, and materials, and providing clear accounts of methodological decisions. By adapting and applying open science tools, methods, and practices, OPA facilitates collaboration,...