Court systems across the world face numerous challenges in providing fair and efficient justice to citizens and firms. These include legal codes burdened with formalism and out of touch with local languages, judges who may be biased against plaintiffs from certain groups, and limited capacity to manage high caseloads, decide appeals, and clear the resulting backlogs. Biased or slow rulings have serious economic and welfare consequences, including a harmful environment for business.
Governments are increasingly transitioning to ‘smart’ courts, which feature advanced information technology, audiovisual capabilities, and integrated electronic case management systems. These innovations capture enormous amounts of new, high-frequency administrative data that have yet to be leveraged. ‘Smart’ court data could, for example, be analyzed to understand the extent of problems with fairness and efficiency, help judges and other court employees overcome behavioral obstacles that impede performance, and even form the basis for new incentive schemes that encourage just and efficient proceedings. Researchers worked separately in India and Kenya to study how administrative data can be used in these capacities.
This study provides empirical evidence on the extent of discrimination and delay in Indian courts, focusing on unequal treatment due to corruption and due to prejudice against disadvantaged groups (based on gender, religion, and caste). The researchers constructed a new dataset on the universe of judicial proceedings in Indian courts. This dataset, the first of its kind in a low- or middle-income country, includes digitized records of roughly 80 million cases over a 110-year period. Adapting machine learning methods, the researchers predicted judicial outcomes based on case characteristics (such as the type of felony, the plaintiff’s legal history, etc.), then identified cases with divergent rulings and labeled them as potential sources of error or bias. The researchers also analyzed implicit biases expressed in the language of judges’ written opinions to understand whether, holding case characteristics constant, judges discriminate against women, Muslims, and members of certain castes in their rulings.
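The divergence-detection step can be illustrated with a minimal sketch: train a classifier on case characteristics, score each case out-of-fold, and flag rulings that differ sharply from the model's prediction. The feature names, synthetic data, and divergence threshold below are illustrative assumptions, not the study's actual specification.

```python
import numpy as np
import pandas as pd
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import cross_val_predict

rng = np.random.default_rng(0)
n = 5000
cases = pd.DataFrame({
    "offense_type": rng.integers(0, 10, n),   # hypothetical coded offense categories
    "prior_cases": rng.poisson(1.5, n),       # hypothetical litigation history
    "urban_court": rng.integers(0, 2, n),     # hypothetical court-location flag
})
# Synthetic acquittal outcome, for demonstration only
logit = 0.3 * cases["urban_court"] - 0.1 * cases["prior_cases"]
cases["acquitted"] = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)

X = cases.drop(columns="acquitted")
y = cases["acquitted"]

# Out-of-fold predicted acquittal probabilities, so each case is scored
# by a model that never saw it during training
p_hat = cross_val_predict(
    GradientBoostingClassifier(random_state=0), X, y,
    cv=5, method="predict_proba",
)[:, 1]

# Rulings that strongly diverge from the model's prediction are flagged
# for closer review as potential errors or bias
divergence = np.abs(y - p_hat)
flagged = cases[divergence > 0.8]
print(f"flagged {len(flagged)} of {n} cases for review")
```

Out-of-fold scoring matters here: if each case helped train the model that scores it, genuinely anomalous rulings would be partly "explained away" and harder to flag.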
Kenyan courts are struggling to tackle a backlog of half a million cases. Alternative dispute resolution approaches, such as referring cases to mediators rather than waiting for judges, could help address the backlog and improve the services vulnerable populations receive to resolve their disputes. The researchers are currently collaborating with the Kenyan Mediation Committee to develop a case management and analytics platform that enables better allocation of cases to mediators. Because the platform can streamline administrative workflows and leverage AI-powered decision support, the researchers are exploring opportunities to evaluate new interventions it makes possible, with the aim of improving the efficiency and effectiveness of alternative dispute resolution mechanisms in Kenya.
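One way to formalize case-to-mediator allocation is as a matching problem. The platform's actual logic is not described here, so the sketch below, including its cost function of specialty mismatch plus current workload, is purely a hypothetical illustration of the idea.

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

rng = np.random.default_rng(1)
n_cases = n_mediators = 8
case_type = rng.integers(0, 3, n_cases)            # e.g., family, commercial, land
mediator_specialty = rng.integers(0, 3, n_mediators)
mediator_load = rng.integers(0, 5, n_mediators)    # open cases per mediator

# Cost of assigning case i to mediator j: penalize specialty mismatch
# and heavier existing caseloads (the weights are arbitrary assumptions)
cost = (case_type[:, None] != mediator_specialty[None, :]) * 10 + mediator_load[None, :]

# The Hungarian algorithm finds the minimum-total-cost one-to-one assignment
case_idx, mediator_idx = linear_sum_assignment(cost)
for i, j in zip(case_idx, mediator_idx):
    print(f"case {i} (type {case_type[i]}) -> mediator {j} (specialty {mediator_specialty[j]})")
```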
In parallel with their work in India, the research team is leveraging publicly available historical data from Kenya to evaluate the persistence of gender and ethnic biases in judicial decisions.
In a working paper, the researchers obtain precise estimates showing that judges are not biased toward their “in-group” along gender or religious lines (i.e., Muslim judges do not systematically rule more favorably for Muslim defendants). These results do not rule out bias in the Indian legal system entirely. Rather, the evidence can help policymakers in India decide where to allocate resources when addressing discrimination and social disadvantage: there are likely other parts of the legal system, beyond judges’ acquittal decisions, where such efforts would be more beneficial.
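The in-group test behind a result like this is typically an interaction specification: regress case outcomes on judge identity, defendant identity, and their interaction, where a tightly estimated interaction coefficient near zero corresponds to "precisely estimated absence of bias." The sketch below uses synthetic data and hypothetical variable names to show the form of such a regression, not the paper's exact specification.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)
n = 10000
df = pd.DataFrame({
    "judge_muslim": rng.integers(0, 2, n),       # hypothetical judge-identity flag
    "defendant_muslim": rng.integers(0, 2, n),   # hypothetical defendant-identity flag
})
# Synthetic outcome generated with zero in-group effect, mirroring the finding
df["acquitted"] = (rng.random(n) < 0.4).astype(int)

# The coefficient on judge_muslim:defendant_muslim is the in-group bias
# estimate; a tight confidence interval around 0 is a "precise null"
model = smf.ols("acquitted ~ judge_muslim * defendant_muslim", data=df).fit(cov_type="HC1")
print(model.summary().tables[1])
```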
Results forthcoming.