Imagine that Eric, a young man, has been convicted of a gang-related crime: he was found by police at the scene of a robbery carried out by his friends. The sentencing judge now needs to make a decision. Not knowing whether Eric poses a risk to the public, she turns to an algorithmic risk assessment - a set of rules, developed on the basis of correlations between individual characteristics and criminal activity, which predicts the likelihood of recidivism. Eric is a conscientious citizen who has never before been in trouble with the law. However, he was raised in foster care, in an area with high rates of crime, poverty, and residential instability - facts to which the algorithm attributes a high risk score. The judge recommends a sentence of the maximum possible duration, with extended post-release supervision.

Why does this matter? The answer most often given is that it matters because of inequality, captured through the language of 'bias' or 'discrimination': the effect of using algorithms can be to exacerbate unjustified differences between people on the basis of considerations such as race, sex, or socio-economic circumstance. Using a diverse set of case studies, and making clear policy recommendations along the way, Artificial Justice unpacks the reasons we might have to object to the use of statistical algorithms to allocate the burdens of policy decisions. It argues that these reasons extend beyond egalitarian concerns. Importantly, they include reasons that stem from the value of individual choice - of having the chance to affect what happens to us by choosing appropriately, and of being equipped to exercise those choices well. The book explores the substantive reasons that contribute to a picture of what is at stake for individuals when we use statistical algorithms to decide how to treat others, and makes robust policy recommendations about the scope and nature of human-algorithm interaction.
Artificial Justice is a compelling and accessible text, which offers a great deal to a wide and interdisciplinary audience of academics, students, and those otherwise interested in learning about algorithmic justice.