In her book Weapons of Math Destruction (Allen Lane, Penguin Books), Cathy O’Neil narrates, from an insider’s point of view, the harmful effects of uncontrolled mathematical procedures, including algorithms, used to process data for decision-making on issues of a social nature.
The examples range from students choosing a university to job applications, from the scheduling of work to the tools used by the police to predict crime, from the mechanisms for evaluating teachers in schools to the setting of insurance premiums, the granting of mortgages, and all financial transactions and information searches on the Net. All of these activities are governed and processed by automatic mechanisms based on the interpretation of large amounts of data derived from traces of individuals’ behavior.
But the algorithms are guided only by efficiency and the maximization of profits, and no one programs them to ensure the fairness or justice of their results. So the crime-prediction tools used by the police tend to send officers to the worst neighborhoods, where people are poorest and where the increased presence of law enforcement only leads to more crimes being recorded, reinforcing the pattern. The problem raised by O’Neil is that the ‘categorization’ adopted by these analysis tools tends to perpetuate officially repudiated prejudices, such as racial prejudice, and to treat poverty as a disease to be contained lest it spread.
Practices of contempt and distrust are constructed through the implementation of techniques wrongly considered objective and neutral. Why do selection tests for fast-food jobs exclude people who show signs of emotional instability? The only concern seems to be avoiding the hiring of people who could create relationship problems. The tests do not select for excellence; these psychological tests are administered automatically to save money, and they serve only to exclude potential troublemakers. But isn’t this a discriminatory practice?
Or consider credit e-scores, which underpin a series of assessments to determine the solvency of potential customers: they are often based on incorrect information, or on correlations between elements that have nothing to do with an individual’s ability to pay. What matters is not whether a given person is solvent, but whether the people who belong to the same category are. The behavior of the individual is crushed by the behavior of the group to which the individual is assigned by obscure similarity mechanisms built on discriminatory classifications, such as geolocation.
Technology produces results that follow from the goals of its programmers. Here the comparison with DeLillo’s underground vault of the undead becomes pressing, because the algorithms seize the opportunity to measure everything, to flatten the future onto the past and the behavior of the individual onto the behavior of the category, and to model continuity and change within a horizon of repetition free of any unpredictability.
This subject, approached more philosophically, is at the center of Wendy Chun’s book Updating to Remain the Same (MIT Press); it would be great to have it translated into Italian. It tackles several hot topics around the net and social networks: the subjectivity created by Big Data, and the possibility of living with one’s fragilities in the online public space without being nailed to them, keeping open the option of an incomplete adherence to our actions. Chun opposes the principle of “consent once, circulate forever,” by which consenting once means allowing the permanent circulation of one’s choice.
The author focuses on a philosophical analysis of the concept of habit. Habits are embodied; they are not addictions, and they can be changed. We must not, and cannot, be crushed by our habits as if they were eternal. It is not right that Big Data should rely on repetition and on a conception of memory understood as a warehouse. The right to oblivion is not just a legal protection to be invoked before a search engine. It means thinking of a permanent and anticipated forgiveness (the English “forgive” can also be read as a “given in advance” that applies to the future). Erasing digital memory does not mean forgetting; rather, it can produce a non-hallucinatory memory that does not inexorably reduce us to the acts of our past.
Chun analyzes the network as something which, by its nature, makes us lose control over our information, letting it circulate unduly. However, it is not inevitable that this loss be exploited by the apparatus of public-private surveillance, as has occurred and as Edward Snowden has shown. Politics decides how to use, or abuse, this loss: whether to nail individuals to their past activities and accuse them of being terrorists, or to despise them as “shameless whores” because they got naked in a video chat.