il manifesto global

Analysis

Are the algorithms discriminating? NYC aims to find out

The city of New York will analyze the computer algorithms it uses to manage certain operations, checking them for unintended discrimination. But the law does not require the city to make the code open source, leaving residents in the dark.

Andrea Capocci
6 min read

The city of New York will set up a task force to analyze the algorithms that municipal offices use to deliver their services, looking for possible discrimination based on gender, age, religion or nationality. The task force will examine every algorithm currently in use and decide, case by case, whether it serves the public interest or conceals some form of discrimination.

The law establishing this task force was recently approved by the City Council at the initiative of Democratic city councilor James Vacca, and is awaiting the (expected) signature of Mayor Bill de Blasio. New York is thus preparing to put the concept of “algorithmic accountability” into practice: beneath their veneer of objectivity, the mostly computerized processes that help individuals, businesses and institutions make decisions can hide politically relevant biases with real-world consequences.

The word “algorithm” has taken on a generally negative connotation, standing for a menacing technology that controls our lives. In fact, the term, which derives from the name of the ninth-century mathematician al-Khwarizmi, originally denotes nothing more than a sequence of instructions for achieving a certain purpose: the recipe for pasta all’amatriciana or the instructions for assembling Legos are perfectly unthreatening algorithms. The notion became problematic only with the spread of computers, whose processes remain hidden from us because of the speed at which the algorithms run and the amount of data they handle. Whenever we run a Google search, for example, we remain ignorant of the innumerable statistical and logical operations taking place on Google’s servers, which is to say, of the algorithm the search engine is using.
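Written out in a few lines of Python, purely for illustration (the quantities and steps are invented), such a recipe-algorithm might look like this:

    # A toy "algorithm" in the original, harmless sense: an explicit,
    # finite sequence of steps. Quantities and steps are invented.
    def amatriciana(guanciale_g, tomatoes_g, servings):
        steps = [
            f"Cut {guanciale_g} g of guanciale into strips",
            "Brown the guanciale in a pan",
            f"Add {tomatoes_g} g of peeled tomatoes and simmer",
            f"Cook and drain pasta for {servings} servings, then combine",
            "Finish with grated pecorino",
        ]
        for n, step in enumerate(steps, start=1):
            print(f"{n}. {step}")

    amatriciana(guanciale_g=150, tomatoes_g=400, servings=4)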

However, every algorithm incorporates the errors and approximations introduced by its all-too-human author, and these often have consequences that are not “virtual” at all. In New York, people standing trial for violent crimes know this all too well. In 2006, the New York City crime lab started using a bioinformatics system called the Forensic Statistical Tool, which was supposed to determine, from the biological evidence found at the crime scene, the compatibility between the ethnic and gender traits of the accused and those of the presumed offender. After an investigation by The New York Times and the investigative journalism website ProPublica, the software’s errors (occurring in 30 percent of all cases, with a large number of innocents sent to prison) became public knowledge, forcing the city’s crime lab to retire the tool and make its code public. The current law on algorithmic accountability was born in the aftermath of that scandal, but it is meant to apply to a much broader set of algorithms.

The U.S. judicial system, for instance, makes extensive use of automated algorithms to decide whether a defendant deserves parole or whether a child is likely being abused. Nor does it stop there: in New York, just as in Italian cities, the relationship between citizens and public and private institutions is often regulated by algorithms. They determine, for example, which schools teachers are assigned to, manage the telephone booking of medical examinations, and trigger checks on employee absences.

Any one of these algorithms, even without its author’s knowledge, might conceal discriminatory mechanisms. For example, given that highly skilled jobs were in the past held mainly by men, a search engine could pick up this statistical correlation (but not its origin in discrimination) and show ads for less-skilled jobs to female users. In this way, discrimination tends to reproduce itself. This is not a hypothetical scenario but a finding of a 2015 study conducted at Carnegie Mellon University, and dozens of similar studies have reached the same conclusion.
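To see the mechanism in miniature, consider the following sketch in Python. The data and the targeting rule are invented for illustration; this is not the system studied at Carnegie Mellon, only a minimal demonstration of how a naive rule trained on biased historical records ends up replaying the bias:

    from collections import Counter

    # Synthetic "historical" records (hypothetical data): each entry is
    # (gender, was_shown_high_skill_ad). The skew encodes past
    # discrimination, not ability.
    history = ([("m", True)] * 70 + [("m", False)] * 30
               + [("f", True)] * 20 + [("f", False)] * 80)

    # "Training": estimate P(high-skill ad | gender) from the past.
    counts = Counter(history)
    totals = Counter(g for g, _ in history)

    def p_high_skill(gender):
        return counts[(gender, True)] / totals[gender]

    # "Serving": target the high-skill ad where the estimate is high.
    # The learned rule simply replays the historical bias.
    for gender in ("m", "f"):
        p = p_high_skill(gender)
        ad = "high-skill job ad" if p > 0.5 else "low-skill job ad"
        print(f"{gender}: P(high-skill)={p:.2f} -> show {ad}")

Run on these invented records, the rule shows the high-skill ad to men (P=0.70) and the low-skill ad to women (P=0.20), even though gender plays no causal role in anyone’s qualifications.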

In cities that are becoming more and more “smart,” the new law introduced in New York could set the standard, although many observers harbor doubts about its real impact. As the lawyer Julia Powles notes in The New Yorker, an early version of the law required that the algorithms used by public institutions be published as open source, giving everyone the opportunity to know exactly what procedures are being followed.

But this ambition was abandoned in favor of a milder version of the law. In practice, the companies that provide services to local authorities would never have agreed to publish their code: their revenue depends largely on intellectual property and the sale of licenses, not to mention the exploitation of users’ personal information. And without the contractors’ cooperation, it is very difficult to ensure true transparency.

If not even the city of New York can bring its bargaining power to bear on behalf of such valuable data about its inhabitants, it is easy to imagine how much our poor Italian local authorities will accomplish. European legislation, however, contains some interesting ideas. In May 2018, the General Data Protection Regulation drawn up by the European Commission will enter into force, applying directly without further ratification at the national level. Article 15 of the regulation establishes the right of citizens to know about “the existence of automated decision-making, including profiling” and to obtain “meaningful information about the logic involved, as well as the significance and the envisaged consequences of such processing for the data subject.” Article 22, in turn, guarantees the right “not to be subject to a decision based solely on automated processing.” However, this rule may be waived with the user’s “explicit consent,” and a glance at our own online behavior is enough to see that “explicit consent” often means nothing more than a hasty click.

Furthermore, according to legal experts, the true bulwark against the irresponsible use of algorithms could be their “legibility,” i.e. the ability of the citizens to whom an algorithm is applied to understand what data it uses and what procedures it applies to that data. The European General Data Protection Regulation puts citizens in a position to demand the legibility of algorithms, as Gianclaudio Malgieri (Free University of Brussels) and Giovanni Comandé (Sant’Anna School of Advanced Studies, Pisa) write in the latest issue of the journal International Data Privacy Law. However, the two researchers concede that trade secrets may restrict the regulation’s application, just as in New York. Others, more pessimistic, believe the regulation will not be enough. For the moment the question remains open, and the decision of the city of New York nonetheless sends an important signal.

Algorithmic accountability could also rekindle the debate on the use of open source software in public administration. At least in theory, Italian public institutions have been required to move in this direction ever since the Stanca Directive of 2003. In practice, things are proceeding in slow motion. Until now, the main argument for the directive was that it saved the state money, since open source software is mostly free of charge. But the marketing policies of the large software companies (Microsoft first and foremost), and the spread of software piracy even inside public offices, have made that argument largely moot. Now the issue of algorithmic transparency raises the stakes, giving the question a new and certainly more interesting political dimension.


Originally published at https://ilmanifesto.it/codici-trasparenti-per-spiare-vite-altrui/ on 2017-12-24
Copyright © 2024 il nuovo manifesto società coop. editrice. All rights reserved.