Researchers look into racist and colonialist artificial intelligence

Patrice Bergeron, The Canadian Press

QUEBEC — Even artificial intelligence (AI) can be racist and colonialist.

Academics are now examining these contentious questions.

For example, algorithms generate hurtful associations about Black people. “It’s absolutely terrible; it’s colonization and racism,” said Professor Karine Gentelet of the Université du Québec en Outaouais (UQO) in an interview with The Canadian Press published on Saturday.

A sociologist and anthropologist specializing in these issues, she takes a very critical view of the spectacular progress of artificial intelligence.

On Tuesday she will take part in a conference hosted by an observatory at Université Laval on the decolonization of artificial intelligence, ethics and the rule of law.

Another example: “Indigenous peoples speak of recolonization through digital technology,” said the researcher, who also works with Indigenous communities.

This is a far cry from the idea of a “neutral” technology and of scientific progress that is purely beneficial.

According to this school of thought, artificial intelligence perpetuates Western colonial domination over Indigenous peoples and over countries in Africa, Asia and Oceania, among others.

“There are power relations because these are technologies that are heavily funded and often developed in (the countries of) the North, then implemented in the South,” explained Ms. Gentelet.

These tools also embed a “representation of what a human person is and how they interact in society” that does not necessarily correspond to people outside the majority group, she continued.

“The representation we have of racialized people in northern societies is not adequate in relation to their contribution to society.”

How does this play out in practice? Take the example of health databases used, among other things, to design new drugs or to document health problems.

Yet marginalized groups that do not consult doctors simply do not appear in those databases, the researcher said.

“There is pre-existing inequality in the data, which does not reflect the composition of the population,” she argued.

Communities already struggling to break out of “social invisibility” thus find themselves excluded once again, because decisions rely on data in which they do not appear.

Another example: the professor points to software being tested by Immigration Canada on refugee claimants and people going through the immigration process.

“The accountability process is much more difficult for them because if they want to complain, where do they go? In Canada where they are not citizens?”

This field of research is still relatively new and controversial. There is no unanimity, but the body of research is growing, Ms. Gentelet said.

“Decolonization, in general, is something that is controversial in Canada. There are people who have opposing views.”

Several solutions are possible, she said, listing a stricter regulatory framework, better accountability and more investment.

This controversial issue must nevertheless be tackled quickly, she insists, because even if artificial intelligence is viewed favourably for solving problems, “there are underlying conditions for the resolution of these problems which will not be fair.”

In her eyes, “it participates in a recolonization, because technology is a reflection of society.”

By artificial intelligence we mean technologies that allow applications or computers to imitate or assist human intelligence.

Think of algorithms, facial recognition technologies, autonomous vehicles, etc.
