The history of the algorithm. The “failures” of Artificial Intelligence

AI is only as good as the data it processes. A poorly designed algorithm spreads bias at scale. Accepting the data that AI processes without subjecting them to critical scrutiny is a sleep of reason that endlessly produces monsters.

by Julio César Guanche
May 16, 2023
in Our Life
Prompt to the AI: realistic portrait of the Cuban independence heroes of 1868 and 1895, including Carlos Manuel de Céspedes, José Martí, Antonio Maceo and Guillermón Moncada.

In 2021, an African American man was arrested in Michigan and handcuffed outside his home in front of his family. The arrest warrant was generated by an Artificial Intelligence (AI) system, which identified him as the perpetrator of a theft. The AI had been trained mostly on white faces and misidentified the offender outright. It was probably the first wrongful arrest of its kind.

That same year, 26,000 families were accused of fraud in the Netherlands. What they had in common was some migrant origin. The affair ruined thousands of innocent people, who lost homes and jobs after being forced to pay back social assistance money.

It was an error that produced an “unprecedented injustice” in that country, and the cabinet resigned in the face of the scandal. The diagnosis of the alleged fraud had been made by an AI.

AI: dataist utopias and dystopias

As novel as AI is, the place of mathematics in the processing of social affairs is not new.

Philosophy brims with dataist utopias. For Thomas More, the establishment of a new form of government had to rest on a tool that guaranteed excellence in the administration of affairs: mathematics. With AI, Saint-Simon’s utopia of “the good administration of things and the good government of people” promises a renewed opportunity through algorithms processed by machines, not for nothing called “computers.”

AI involves the interaction between software that learns and adapts, hardware with massive computing power, and vast amounts of data. It has been defined as “a constellation of processes and technologies that enable computers to complement or replace specific tasks that would otherwise be performed by humans, such as decision making and problem-solving.”

Its advantages are many: informed decision-making, management of massive volumes of information, the fight against the climate crisis, restoration of ecosystems and habitats, slowing of biodiversity loss, efficient allocation of social resources, improvement of humanitarian aid and social assistance, medical diagnoses and health applications, control of traffic flows, etc.

That said, the political connotations of the uses of mathematics have been noted in many ways. Engels wrote to Marx in 1881: “Yesterday, at last, I found the strength to study your mathematical manuscripts and, although I did not use supporting books, I was glad to see that I did not need them. I congratulate you on your work. The matter is as clear as daylight, so it never ceases to surprise me how mathematicians insist on mythologizing it. It must be because of their partisan way of thinking.”

Karl Popper, author of The Open Society and Its Enemies, considered the “bible of Western democracies” by Bertrand Russell (himself a mathematician), began his career as a teacher of mathematics and physics.

Leviathan, by Thomas Hobbes, a thoroughly anti-republican political program, held that good government comes from modeling the state on a machine: “this great Leviathan called a COMMONWEALTH or STATE… is but an artificial man, though of greater stature and strength than the natural.”

AI, that “artificial man,” promises to be neutral but is often partial: it thus produces an “algorithmic leviathan.” It frequently operates as a black box: the information supplied to the algorithm is known, but the process it follows to reach a given result is not. Under these conditions, if discrimination exists, there is no way to know whether it occurred on the basis of sex, ethnicity, skin color, age, religion, ideology, or some other dimension.

Without black boxes (in some countries they are now being subjected to legal regulation) it would be possible to identify how an algorithm discriminates. In general, it does so because the information on which it is trained is partial, or because it reproduces pre-existing discriminatory biases. Nor can it be ignored that the technology industry has historically been made up mainly of white men, drawn from class strata and cultural frameworks that are quite homogeneous.
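
A minimal synthetic sketch of that first mechanism, assuming nothing about the real systems discussed in this article: a classifier trained on data in which one group is heavily underrepresented tends to make more errors on that group, even though no “race” variable appears anywhere in its features. The groups, numbers and features below are invented for illustration.

```python
# Sketch: training on skewed data produces unequal error rates per group.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

def make_group(n, shift):
    """Two features; the true decision boundary differs slightly per group."""
    X = rng.normal(loc=shift, scale=1.0, size=(n, 2))
    y = (X[:, 0] + 0.5 * X[:, 1] > shift).astype(int)
    return X, y

# Group A dominates the training set; group B is barely represented.
Xa, ya = make_group(5000, shift=0.0)
Xb, yb = make_group(100, shift=1.5)
X_train = np.vstack([Xa, Xb])
y_train = np.concatenate([ya, yb])

model = LogisticRegression(max_iter=1000).fit(X_train, y_train)

# Evaluate on fresh, equally sized samples from each group.
for name, shift in [("group A", 0.0), ("group B", 1.5)]:
    X_test, y_test = make_group(2000, shift)
    err = (model.predict(X_test) != y_test).mean()
    print(f"{name}: error rate = {err:.1%}")
```

Because the model mostly fits the majority group’s boundary, it misclassifies far more of the minority group’s fresh examples; the bias comes entirely from who is and is not represented in the training data.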

However, discrimination can also be intentional. Hate, division and lies are good for business: they multiply the exchanges to be monetized. In this field, the production of discrimination can be hidden behind trade secrecy.

Algorithmic racism

The notion of race occupies a central role in algorithmic discrimination.

This centrality is reflected in the Recommendation on the Ethics of Artificial Intelligence, the first global instrument on the subject, adopted in November 2021 by the 193 UNESCO member states. Among other objectives, the document seeks to ensure equity and non-discrimination in the implementation of AI, to prevent existing social inequalities from being perpetuated, and to protect vulnerable groups.

There is no scientific way to justify the existence of human “races.” All individuals of the human species share 99.99% of their genes and DNA; the traits that determine people’s physical appearance involve only 0.01% of the genetic material. The concept of race is a result of racism, not its origin.

AI cloaks its handling of race in science (it assures us that race is an invisible variable), but it often operates on pseudoscientific foundations.

The first formal use of the term “pseudoscience” is recorded in 1824, to describe phrenology. Facial recognition systems that claim to predict dangerousness, characteristics or personality from photographs reproduce the logic of this pseudoscience.

An expression of scientific racism, like craniometry, racial demography, and criminal anthropology, phrenology claimed it was possible to determine character, personality traits, and criminal tendencies from the shape of the skull, the head, and the facial features. It lost any scientific validity long ago.

However, according to Achille Mbembe, the leading Cameroonian philosopher, “new security devices [such as facial recognition using AI] take up elements of the past from previous regimes: the disciplinary and penal regimes of slavery, elements of colonial conquest and wars of occupation, juridical-legal techniques of exception.”

There is abundant evidence of this. The COMPAS system, used in the United States to predict recidivism, has been questioned because African American defendants are twice as likely to be misclassified by it. The same resume is 50% more likely to lead to a job interview if the candidate’s name is identified by the algorithm as European American rather than African American.

Joy Adowaa Buolamwini, a computer scientist at the Massachusetts Institute of Technology (MIT) to whom Netflix dedicated the documentary Coded Bias, has evaluated facial recognition systems from several cutting-edge companies in the field. Her conclusion was that recognition error rates for lighter-skinned men were no more than 1%, while for darker-skinned women the errors reached 35%.
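
A minimal sketch of a disaggregated audit in that spirit: instead of reporting one overall accuracy figure, the system’s error rate is computed separately for each subgroup, which is what exposes gaps of this kind. The records below are invented placeholders, not data from any real benchmark or vendor.

```python
# Sketch: audit a black-box classifier's outputs by subgroup.
from collections import defaultdict

# (subgroup, true_label, system_prediction) -- hypothetical examples only.
records = [
    ("lighter-skinned men",  "male",   "male"),
    ("lighter-skinned men",  "male",   "male"),
    ("lighter-skinned men",  "male",   "male"),
    ("darker-skinned women", "female", "male"),
    ("darker-skinned women", "female", "female"),
    ("darker-skinned women", "female", "male"),
]

totals, errors = defaultdict(int), defaultdict(int)
for subgroup, truth, prediction in records:
    totals[subgroup] += 1
    errors[subgroup] += int(prediction != truth)

for subgroup, n in totals.items():
    print(f"{subgroup}: error rate {errors[subgroup] / n:.0%} ({errors[subgroup]}/{n})")
```

A single aggregate error rate over all records would hide exactly the disparity this kind of breakdown makes visible.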

Leading facial recognition systems have failed to recognize the faces of Michelle Obama, Oprah Winfrey and Serena Williams. Twitter was unable to identify Barack Obama. A rapper built with Artificial Intelligence was “fired” for reproducing racist stereotypes. Tay, an AI designed to have “cool” interactions with its users, went in less than 24 hours from saying that humans were “super cool” to saying that “Hitler did nothing wrong.” Black users may not be shown home-sale ads on Facebook.

Is AI racist?

Let ChatGPT answer the question:

ChatGPT’s response, translated:

Yes, AI (artificial intelligence) can be racist if it has been trained with data containing racial prejudice or discrimination. AI learns by processing large amounts of data and, if those data contain prejudice or discrimination, the AI can replicate that same prejudice and discrimination in its decisions and results.

For example, if a machine learning algorithm is trained with data that underrepresent certain racial groups, the AI may learn to make biased and discriminatory predictions. Similarly, if the dataset used to train the AI contains racial stereotypes, the AI may reproduce those same stereotypes.

It is important to point out that AI has no intentions or emotions and cannot be consciously racist the way people are. However, the biased decisions and results of AI can have discriminatory consequences in real life, which makes it crucial to address the problem of bias and discrimination in machine learning algorithms and to improve the quality of the data used to train them.

AI is only as good as the data it processes. A poorly designed algorithm spreads bias at scale. Accepting the data that AI processes without subjecting them to critical scrutiny is a sleep of reason that endlessly produces monsters.

Police data series, for example, rest on databases that have been built, in many cases, from partial data, with practices that later ceased to be legal, or with methods that have since been declared illegal, in contexts where police racism was commonplace.

Racism is a structural, social and cultural legacy that is at the same time constantly rebuilt: it is reinterpreted, it evolves, it reproduces itself. When processing data, it is not enough to apply a criterion of justice such as “non-discrimination,” one that keeps the names involved anonymous, makes racial data invisible, and understands justice as “treating everyone as equals.”

A well-known case in the United States showed the inability of this criterion to produce fair outcomes: an AI system for granting bank loans omitted names and any data that could point to skin color. Even so, it produced markedly racist results.

The investigation showed that asking for each applicant’s home zip code reintroduced racism, even though the intention had been to strip the mark of race from the data collected. Zip codes of areas identified as having majority African American populations were disadvantaged compared to neighborhoods whose zip codes the algorithm identified as mostly white.
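
A minimal synthetic sketch of that proxy effect, with no connection to the actual case: the model is never given a “race” variable, but because zip code is strongly correlated with group membership in the (invented) historical data, its decisions still split along group lines. All names and numbers below are illustrative assumptions.

```python
# Sketch: a "race-blind" model rediscovers race through the zip-code proxy.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
n = 20_000

# Hypothetical segregated geography: group membership strongly predicts zip.
group = rng.integers(0, 2, size=n)                          # never shown to the model
zip_code = np.where(rng.random(n) < 0.9, group, 1 - group)  # 90% aligned with group

income = rng.normal(loc=50 + 10 * (group == 0), scale=15, size=n)

# Historical approvals: partly income, partly past discrimination by group.
past_approval = (income + 25 * (group == 0) + rng.normal(0, 10, n) > 60).astype(int)

# Train only on "race-blind" features: income and zip code.
X = np.column_stack([income, zip_code])
model = LogisticRegression(max_iter=1000).fit(X, past_approval)
pred = model.predict(X)

for g in (0, 1):
    print(f"group {g}: predicted approval rate = {pred[group == g].mean():.1%}")
```

Dropping the sensitive attribute (“fairness through unawareness”) does not help when another feature encodes it almost perfectly; the historical bias simply travels through the proxy.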

The present is its history: what is thrown out the door comes back through the window. Overcoming history requires making it visible, not the other way around.

Prompt to the AI: realistic portrait of the Cuban independence heroes of 1868 and 1895, including Carlos Manuel de Céspedes, José Martí, Antonio Maceo and Guillermón Moncada.

Technological solutionism is not the solution

The solutions offered by machines carry an aura of ideological neutrality and technological efficiency, and hold out new capacities for facing old problems. AI is presented as the technological management of the organization of common affairs. It is easy to frame it within the non-partisan ideology of “technological solutionism.”

However, for Cathy O’Neil, a U.S. mathematician and activist, algorithms are “opinions locked up in mathematics.” Without a commitment to race-conscious statistics, without making the data account for the socioeconomic differences between population groups, without guaranteeing participation, oversight and transparency in the collection and use of data, the algorithm loses much of its technological fascination and reveals, rather crudely, the political nature of the context in which it operates.

Race does not exist, but racism does. AI is not racist per se, but it produces racist results. Without taking charge of history, the algorithm is an opinion that encodes exclusion and programs the discrimination dominant in the history inscribed in its data.

Tags: Artificial Intelligence (AI), racism
Julio César Guanche

Professor and researcher. He has written several books and a great number of essays and articles. He would have liked to be a trumpet player, but life is what it is. He feels the same passion for film, history, music and popular culture. He deeply distrusts anyone who cannot cook. He researches politics, history and law, since everyone has fun however they can.
