If you start typing the name Alex Jones into Google's search bar, it suggests "Alex Jones, radio host" along with a thumbnail image so you can see whether this is the person you're looking for.
The "radio host" qualifier might surprise you for this far-right American, founder of the website Infowars, who became famous for his conspiracy theories about the Sandy Hook shooting and the September 11 attacks.
Do the same exercise with Gavin McInnes, founder of the far-right group Proud Boys, and he is presented to you as a writer. For Jake Angeli, known as the QAnon Shaman, one of the participants in the assault on the United States Capitol in 2021, the search engine uses the qualifier "activist."
Discussions about platforms that help spread misinformation online regularly target Facebook and Twitter, while Google's search engine is too often overlooked, argues Ahmed Al-Rawi, a professor at Simon Fraser University's (SFU) School of Communication in British Columbia.
Yet the descriptions given by Google's autocomplete tool omit part of the truth and embellish these individuals' image with the general public, explains Ahmed Al-Rawi, who is also director of The Disinformation Project, a research lab at SFU.
"Calling someone an activist when that person has spread hatred and even called for genocide is not normal."
Activists, journalists, and many more
As part of the study, Ahmed Al-Rawi, along with other researchers from the Disinformation Lab, examined the subtitles suggested by Google's search engine for 37 people considered to be conspiracy theorists or to have endorsed conspiracy theories.
Among the 30 who had a subtitle, none of the qualifiers reflected the public's view of these individuals, according to Ahmed Al-Rawi: 16 were presented for their contribution to the arts, 4 were described as activists, 7 were associated with their original professions, 2 with journalism, one with his sports career, and one was identified as a researcher, according to the results of the study published in the M/C Journal.
Knowing that Google's autocomplete suggestions can vary depending on the geographic search area, the researchers were surprised to find the same results in Canada, the United States, and the Netherlands. They reached this conclusion by using a virtual private network (VPN) with endpoints in those three countries.
A bias impossible to assess without knowing the algorithm
An algorithm applies a series of rules to a defined database to arrive at a result. If the underlying data is biased, those inequities carry through to the output.
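As a rough illustration of this point, consider a toy subtitle-suggestion routine (entirely hypothetical, not Google's actual method): it simply picks the most frequent descriptor found in a scraped corpus. If self-authored pages dominate that corpus, the flattering label wins, even though the rule itself is neutral.

```python
from collections import Counter

# Toy corpus: descriptors scraped from the web for one hypothetical figure.
# Self-published bios (and copies of them) outnumber news coverage.
corpus = {
    "Example Figure": [
        "radio host",            # from the figure's own site
        "radio host",            # syndicated bio copied from that site
        "radio host",            # another mirror of the same text
        "conspiracy theorist",   # from independent news coverage
    ],
}

def suggest_subtitle(name: str) -> str:
    """Return the most frequent descriptor found for `name`.

    The counting rule is neutral, but a skewed input corpus
    yields a skewed suggestion.
    """
    counts = Counter(corpus.get(name, []))
    descriptor, _ = counts.most_common(1)[0]
    return descriptor

print(suggest_subtitle("Example Figure"))  # prints "radio host"
```

The sketch shows why "garbage in, garbage out" applies here: fixing the output would require either curating the data or changing the rule.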
In this case, are the qualifiers assigned to conspiracy theorists the result of an error in Google's database or of a deliberate choice? It is difficult to say, given how little information is available about how the algorithm works, argues Stéphane Couture, a communication professor at the Université de Montréal and co-director of the Laboratory on Online Rights and Alternative Technologies.
However, it is clear that these titles were not chosen as part of an editorial policy, he stresses.
"Google doesn't have an editor who decided to attach that radio host subtitle to Alex Jones," explains Stéphane Couture.
Ahmed Al-Rawi, for his part, fears that conspiratorial groups could take advantage of this system.
If Google's database draws on information available on the Internet, as the platform states, these conspiracy theorists could influence how the search tool presents them through the way they describe themselves online, maintains the director of The Disinformation Project.
"The system is being manipulated. It's like a loophole that lets conspiracy theorists promote themselves with Google's help."
A limited vision
Google relies on the neutrality of its search process to retain the support of as many Internet users as possible, explains Stéphane Couture.
Although this neutrality is merely claimed and these subtitles give a limited picture of the individuals, the fact remains that they are not false, the communication professor argues. Alex Jones is indeed a radio host, he recalls.
"Of course, if the platform presented Alex Jones as a conspiracy theorist, the people in Alex Jones's camp would be angry with Google."
"This information seems biased to us, but it is not to Alex Jones."
For Stéphane Couture, it is rather in choosing whether or not to give these individuals a subtitle that Google takes a position. Other controversial figures, such as Osama bin Laden, have none, for example.
Control and transparency required
This is not the first time Google's search algorithms have come under scrutiny, the two researchers recall.
Ahmed Al-Rawi believes that enough pressure from the international community could prompt the digital giant to intervene to resolve the situation.
Google has already modified its search algorithm after an outcry over pejorative terms used to describe women and racialized groups in the search tool, recalls Ahmed Al-Rawi.
In 2020, Google also decided to remove gendered labels, such as "woman," from its algorithm to comply with its ethics rules on artificial intelligence.
Stéphane Couture says Google should be more transparent about how its algorithms work. He suggests that the platform withdraw these subtitles from its suggestions and appoint a chief editor who could be held accountable when questions of algorithmic bias arise.
Their impact is very real, he says, given how heavily the general public relies on the platform.
"It's like saying, 'Osama bin Laden was a former citizen of Saudi Arabia.' It completely erases his history and, in a strange way, the political dimension behind it."
Internet giants like Google justify these biases in their algorithms by saying they are a "mirror of society," but for a growing number of researchers and politicians, these platforms have an editorial role to play, says Stéphane Couture.
With information from Nantou Soumahoro
Google offers watered-down description of controversial characters, study finds