
Commentary. The point is that the “algorithmic black box” is a direct function of giant economic interests. If AI is now capable of “brain reading,” it is clear that we are entering another era of the Anthropocene.

Digital capitalism, information and democracy

The Senate Anti-Discrimination Commission resumes its work on Tuesday. During the last legislature, this body, chaired by Liliana Segre, did important work on a number of issues.

Those issues included freedom of expression, hate speech on social media and the accountability of online platforms. The resumption of its activities comes in a challenging context: after a unanimous vote to establish the Commission, the right procrastinated for months over actually setting it up. Moreover, at the end of 2022, a severe crisis hit the social media universe, with thousands of layoffs among workers and managers from Meta to Twitter, as well as at global players such as Amazon and Uber – a crisis that the new Commission cannot fail to examine.

In the summer of 2022, the European Parliament approved both the new Digital Services Act (DSA) and the Digital Markets Act (DMA), imposing strict rules on the big platforms. Yet, “the rules that the European Commission proposed a year ago and that the Parliament is examining are already obsolete to an extraordinary degree” (Riccardo Luna, La Stampa). Recently, the Italian Antitrust Authority opened an investigation into Meta, particularly concerning the ChatGPT issue, that is, the “trawling” collection of personal data needed to feed an insatiable algorithm nourished on prejudice and bias against women and minorities. The Antitrust Authority considers it “particularly urgent” to revise the AI Act, the European Regulation on Artificial Intelligence (especially as regards generative AI). This need is also being recognized in Germany, France, Canada and Australia.

Nor is it just a matter of rights and protections: the economic implications remain decisive. For Kate Crawford, one of the leading scholars of artificial intelligence, this is not a matter of pure logic or computational technique but involves “the systemic forces of unfettered neoliberalism, austerity politics, racial inequality, and widespread labor exploitation.” According to none other than Elon Musk, AI can pose “profound risks to society and humanity.”

Recently, the resignation of “Godfather of AI” Geoffrey Hinton from Google, in order to speak out about the serious dangers associated with AI development, caused a stir. As Teresa Numerico wrote in il manifesto, “the malicious use of these tools is a certainty rather than a possibility.” Hence the need for “regulation of these technologies.”

Certainly, Parliament needs to be aware and pay attention, and the Segre Commission needs to do strategic work, including by means of a fact-finding investigation into AI and algorithms. This is a problem with implications for democracy: “Politics is in the vortex of the algorithm,” or, put otherwise, “artificial intelligence is manipulating politics” (Massimiliano Panarari, La Stampa), while on the international geopolitical scene, “social media wars” are being waged, such as the U.S.-China confrontation over TikTok, which is seen as a vehicle for espionage and data leaks in Beijing’s favor.

The point is that the “algorithmic black box” is a direct function of giant economic interests. If each of us is “profiled,” laid completely bare in our possessions, in the way we present ourselves, in our true precariousness, our solvency and reliability, our orientations and passions, and if AI is now capable of “brain reading,” it is clear that we are entering another era of the Anthropocene.

In late March, the journal International Politics and Society published an article on the digital gender divide, arguing that the new digital frontier particularly penalizes women and fosters “sexist hate, gender-trolling and gendered disinformation.” There is no doubt that hate speech and the economic interests of the big platforms are closely connected in the age of “the data economy dominated by Big Tech” and “digital capitalism.”

Byung-Chul Han, a theorist and critic of infocracy, wrote that when information “is produced in private spaces and distributed to private spaces,” and thus “is spread without forming a public sphere,” this has “highly deleterious consequences for the democratic process,” and any possibility of transformation is lost: “today, revolution is no longer possible.” Thus, there are issues of rights, democracy, and economic and financial power at play. According to Han, “democracy is under threat.” According to Crawford, dealing with algorithms and AI means dealing with data protection, workers’ rights and racial inequality. In short, the new Commission has its work cut out for it.
