Facebook prefers to temporarily censor scientific information from highly credible sources rather than tolerate harmful publications for too long, while it works to improve algorithms that still cannot detect sarcasm.
Agence Science-Presse is a non-profit media outlet that has existed for 43 years. It notably gave birth to the magazine and TV show Les Débrouillards.
At the end of March, the Agency published on its Facebook account the article "Arguments and Strategies of the Anti-Vaccine Movement."
The article, however, was removed on March 28 "for failing to meet community standards," and the page was "restricted for 90 days for repeatedly sharing false information," according to notices Facebook placed on the Agency's page.
The Agency then stepped up its efforts to correct the situation, which took more than three weeks (see table).
"It's frustrating at the time, but it's so absurd that in the end we try to laugh about it," laments Pascal Lapointe, editor-in-chief of the Agency.
"With a newspaper, a TV station or a radio station, we would have contacted the editor and solved the problem in 15 minutes. But here we are in a total fog. We don't know who to complain to; we can't talk to a manager. We press a button and don't know whether it goes anywhere. And when the problem is finally solved, we don't know what they based their decision on," he adds.
As for how Facebook operates, Mr. Lapointe sees only two possibilities.
"Somewhere in the algorithm, after receiving a certain number of complaints, the robot reacts and blocks the site. I would not be surprised if anti-vaxxers pressed a button claiming that we publish false news," he says.
The other possibility, according to him, is that the algorithm analyzes certain words.
"The article mentioned that for a century, anti-vaxxers have always used the same arguments. It said, for example, 'the first falsehood: vaccines are dangerous.' Did the robot automatically flag this as fake news?" he wonders.
A non-profit organization, the Agency did not lose any money, "but it had an impact on traffic. A lot of small sites like ours are too dependent on Facebook. It's annoying; we would prefer not to be," says Mr. Lapointe.
The example of Agence Science-Presse is not unique in the science popularization community.
"We published a video on natural immunity, with references to scientific publications. Facebook blocked the post and gave us a warning. At the next 'infraction,' we will lose the page," deplores Dr. Mathieu Nadeau-Vallée.
Facebook’s parent company says it understands the “irony” of seeing a site dedicated to fighting misinformation being blocked for this reason.
“We are sorry this happened. Our systems are not perfect, we know that. We continue to work on it,” explained a spokesperson for Meta.
"Meta prefers to remove a publication that is not harmful and reinstate it two days or two weeks later, rather than leaving a (harmful) one up for too long," he added, explaining that the multinational must manage three billion users in a hundred languages.
The explanation leaves the Agency's editor skeptical, "considering the number of fake-news accounts that have been reported over the years around the world and have remained online," says Pascal Lapointe.
The first line of verification of posts on Facebook is an artificial-intelligence algorithm. Disputed cases are then reviewed by an employee.
"Each employee has a specialty, and if it concerns COVID, for example, it goes into the same queue. Maybe that's why it took longer," Meta said, explaining the three weeks it took the Agency to resolve the issue.
Meta did not want to reveal how its first line of verification using artificial intelligence works.
"We don't give out the recipe; it would be easy to circumvent it afterward," the spokesperson said.
No laughing matter
Also a science popularizer, Dr. Alain Vadeboncoeur recently had four of his posts censored by Facebook.
"The first two were slightly ironic. The third (in February) followed a text comparing the death rate from COVID to that of influenza, which had provoked many reactions. I had added context insisting on certain entirely well-founded points," he recalls.
"I lost access to my account for 24 to 48 hours, then was banned from publishing on other pages. My interpretation is that there were quite a few reports on this post that made Facebook's algorithms react. It would be interesting to know how they end up suspending people whose discourse could not be more aligned with science," says Dr. Vadeboncoeur.
For the fourth, on Monday, Mr. Vadeboncoeur had clearly indicated at the top of his post that it was irony, publishing an image giving an absurd recipe for "devaxing."
"The challenge for artificial intelligence is often detecting sarcasm. The algorithms are not yet strong enough to understand that Dr. Vadeboncoeur is using humor or sarcasm. That's why a human provides the second level of verification," replied the spokesperson for Meta.
Dr. Vadeboncoeur recognizes that it is not "always easy" to detect irony. But this flaw also has a major impact on the hunt for fake news.
"In the most recent case, I had explicitly said it was (irony), and it didn't change anything. It amounts to no longer being able to denounce an erroneous image or publication," he says.
And while he understands the precautionary principle cited by Facebook, of first blocking publications and then analyzing them, the physician, who is followed by 50,000 people, believes the multinational could act differently.
"When Facebook recognizes a real identity (blue verification badge) and it is a highly followed account, it seems absurd to me to suspend people who spend their lives fighting disinformation over a touch of irony or even a misreading of the situation," regrets Mr. Vadeboncoeur.
Pascal Lapointe of Agence Science-Presse argues for regulation of algorithms.
"This is one more example. We do not know who manages this or how it is managed. It's opaque, and it's abnormal that we, Dr. Vadeboncoeur, Dr. Nadeau-Vallée and many others are blocked without understanding why. These companies have gained a power over information worldwide that has no equivalent in history," he says.
Online, the Agency notes that its work in information literacy earned it in 2019 "an award from the Canadian Journalism Foundation, a prize funded by … Facebook!"
Dr. Mathieu Nadeau-Vallée suggests another solution: "Facebook should give an official seal to certain scientific pages."
Simon Thibault, assistant professor of political science at the University of Montreal, would not comment directly on these cases, but he believes that they illustrate “the technical challenge of ensuring that algorithms are more vigilant.”
"Facebook, for example, wanted to be more intransigent with misinformation in the context of the pandemic. But the volume of information (to process) is quite staggering. These platforms have been criticized, they are trying to clean up, and of course there are going to be incongruous situations like these," he believes.
A committee of 12 experts was created on March 30 to help the Canadian government draft legislation to combat harmful content online.
"People say that the self-regulation of these platforms does not work, and that states must legislate to ensure these companies really put in the resources and effort needed so that genuinely problematic content is removed," Mr. Thibault analyzes.
3 WEEKS OF WAITING
March 28: Facebook removes the Agence Science-Presse article. The Agency contacts Facebook, whose employee can do nothing.
March 29: The Agency republishes its article.
March 31: The article is removed again.
April 1: The Agency contacts a Facebook executive, Kevin Chan, and explains the situation.
April 4: The executive refers the Agency to the Director of Media Partnerships.
April 8: Still with no news, the Agency presses customer service, which agrees "exceptionally" to refer the matter to a superior.
April 14: The Agency again contacts Facebook and the two executives.
April 20: The situation is resolved.