Racially biased outcomes have increasingly been recognized as a problem that can infect software algorithms and datasets of all types. Digital platforms, in particular, are organizing ever greater portions of social, political, and economic life. This essay examines and organizes current academic and popular press discussions of how digital tools, despite appearing objective and unbiased, may in fact reproduce, or even reinforce, existing racial inequities. At the same time, digital tools may also be powerful instruments of objectivity and standardization. Based on a review of the literature, we modify and extend a “value chain–like” model introduced by Danks and London (2017) that depicts where ethnic bias can enter algorithmic decision-making. The model has five phases: input, algorithmic operations, output, users, and feedback. Within these five phases, we identify nine distinct types of bias that can occur in an algorithmic model: (1) training data bias, (2) algorithmic focus bias, (3) algorithmic processing bias, (4) transfer context bias, (5) misinterpretation bias, (6) automation bias, (7) non-transparency bias, (8) consumer bias, and (9) feedback loop bias. In our discussion, we note some potential benefits of moving decisions online, as they then become traceable and amenable to analysis. New social challenges arise as algorithms, and the digital platforms that depend on them, organize increasingly large portions of social, political, and economic life. Formal regulation, public awareness, and additional academic research are crucial, because algorithms will make or frame decisions, often without either the creators of the algorithms or those affected by them being aware of the biases that might shape those decisions.
September 27, 2018
Silva, Selena and Kenney, Martin, Algorithms, Platforms, and Ethnic Bias: An Integrative Essay (August 21, 2018). Phylon: The Clark Atlanta University Review of Race and Culture, Forthcoming. Available at SSRN: https://ssrn.com/abstract=3246252