Algorithmic exclusion: The fragility of algorithms to sparse and missing data

Editor's note:

This is a Brookings Center on Regulation and Markets working paper.

Abstract

This paper introduces the idea of ‘algorithmic exclusion’ as a source of the persistence of inequality. Algorithmic exclusion refers to outcomes in which people are excluded from algorithmic processing: the algorithm cannot make a prediction about them at all. This happens because the same conditions that produce societal inequality can also produce bad or missing data, leaving algorithms unable to make successful predictions for the people affected. The paper argues that algorithmic exclusion is widespread and that its consequences are significant.
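
The working paper itself contains no code; the Python sketch below is a minimal, hypothetical illustration of the mechanism the abstract describes: a toy scoring model that simply cannot return a prediction when a person's record has missing fields. All field names, weights, and data here are invented for demonstration and are not drawn from the paper.

```python
# Hypothetical illustration of algorithmic exclusion: people with sparse or
# missing data receive no prediction at all, rather than a biased one.

applicants = [
    {"name": "A", "income": 52_000, "credit_history_years": 9},
    {"name": "B", "income": 38_000, "credit_history_years": None},  # thin credit file
    {"name": "C", "income": None,   "credit_history_years": 2},     # no income record
]

def score(applicant):
    """Return a toy credit score, or None when required inputs are missing."""
    income = applicant["income"]
    history = applicant["credit_history_years"]
    if income is None or history is None:
        return None  # excluded from algorithmic processing: no prediction possible
    return 0.5 * (income / 1_000) + 4.0 * history

scored = {a["name"]: score(a) for a in applicants}
excluded = [name for name, s in scored.items() if s is None]

print(scored)    # {'A': 62.0, 'B': None, 'C': None}
print(excluded)  # ['B', 'C'] -- the people the algorithm cannot score
```

In practice the exclusion can be less visible than an explicit None: pipelines that silently drop incomplete rows during preprocessing produce the same outcome without ever surfacing it.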

Download the full working paper here.


Catherine Tucker has served as a consultant for numerous technology companies, a full list of which can be found here. The author did not receive financial support from any firm or person for this article or, other than the aforementioned, from any firm or person with a financial or political interest in this article. The author is not currently an officer, director, or board member of any organization with a financial or political interest in this article.
