Africa facing artificial intelligence and trans-humanism

“A life is a life; one life is not older or more respectable than another life, just as no life is superior to another life.”

Charter of Mandé, Article 1 of the declaration of human rights, Mali Empire, 1236

According to the professor of cybernetics Kevin Warwick, “in the same way that, in the distant past, humans separated from their chimpanzee cousins, in the times to come the augmented will separate from ordinary humans, who will be to them no more than the chimpanzees of the future, doomed to disappear or to stagnate in the reserves that the posthumans will perhaps agree to spare for them. Such, then, would be the alternative: augment or end up in the zoo.” This assertion, outrageous as it may be, is far from mere verbiage, because it comes from an eminent professor at the University of Coventry, holder of several scientific distinctions. At the age of 40 he received the prestigious honor of a higher doctorate from Imperial College London and the Czech Academy of Sciences. From the IET (Institution of Engineering and Technology), he received the Achievement Medal and the Mountbatten Medal. He also received the Ellison-Cliffe Medal from the Royal Society of Medicine. Kevin Warwick prides himself on being the world’s first cyborg (cybernetic organism). He augmented himself by having an RFID (radio-frequency identification) chip implanted under his skin, which allows him to perform a number of tasks, including opening doors remotely. Question: which continent will have the highest concentration of “Warwick chimpanzees”? We will come back to this.

Man’s will to augment himself: an engine of progress?

The will to self-augmentation has been inherent in Man’s evolution ever since he attained hominid status through the production of knowledge, Man being par excellence the animal whose imagination largely transcends his physical and even mental capacities. Professor Warwick’s action is thus part of a new stage in a process that began in the Paleolithic era in Africa, more than 3 million years ago, when the ancestor of Man first began to carve stone to make a tool: a way of extending his hand. Then, 10,000 years ago, with the birth of agriculture, Man had the idea of augmenting himself by domesticating his animal cousins, more powerful but less intelligent than he, to carry out field work.

As far as memory is concerned, from the fourth millennium before our era, conscious of its fallibility, Man had the inspiration, quite simply brilliant for the time, to record his thoughts in writing on a material support: clay tablets, papyrus, even bones. With the creation of writing in Mesopotamia and Egypt, and later in China, Humanity reached a decisive turning point in the production, dissemination, conservation and accumulation of knowledge. This tool for extending human memory, the main repository and vehicle of knowledge until the arrival of audiovisual media and ICT, moreover gave rise to the emergence and development of other sciences. Utilitarian mathematics (accounting, elementary geometry), for example, appeared at almost the same time as writing. Indeed, the evolution of Man is so marked by writing that its appearance is taken as the boundary between Prehistory and the History of Humanity.

Man was not concerned only with the world visible to the naked eye. Lenses dating from the third millennium before our era have been discovered in Egypt, which strongly suggests that human beings were already seeking to augment themselves in order to discover other universes: those too distant to be visible, the cosmos; and those nearby but invisible, which he rubs shoulders with or harbors within his own organism, the infinitely small. Today we have reached the Titan Krios electron microscope, installed by the Institut Pasteur in Paris in 2018 and presented as the most powerful in the world. It makes observations possible at the scale of a tenth of a nanometer: the atomic scale.

The process of human augmentation continued until the end of the 18th century, when a pre-singularity occurred, coming from England: the first industrial revolution. It was marked by the advent of the modern steam engine, which made it possible to move from artisanal production to industrial production, that is, mass production at a rate beyond the physical capacities of man, with profound socio-economic repercussions. A century later came the second industrial revolution, driven from Germany and the East Coast of the United States, whose main characteristics were the exploitation of oil, the harnessing of electricity, the development of the automobile industry, and the appearance of the telex and the wireless telephone. Man thus crossed a new stage in his process of augmentation, giving himself the means, by automobile, to move at speeds that his physiology does not allow, and, by telephone, to be heard while physically absent.

However, it was in the aftermath of the Second World War that the human species, after having augmented itself for more than three thousand millennia on the mechanical, memorial, optical and kinematic levels, reached a new level in this process of outsourcing tasks that its nature allows it to think of and imagine without having the material or mental capacity to carry out. In 1948 the first computer was born, a Von Neumann-architecture machine equipped with a stored program (an algorithm) allowing it to perform arithmetic and logical operations. That year, a new dimension of the process of human augmentation appeared: intellectual augmentation. It was year one of artificial intelligence (AI), in the first sense of the term.

Then, towards the end of the 1970s, access to the computer was democratized, both in business and in households, thanks once again to Man’s natural will to augment himself in order to realize fruits of his imagination that his nature does not allow him to implement. In 1978, the trigger came from Dan Bricklin, a young American computer scientist enrolled in an MBA at Harvard Business School. Bricklin had noticed the forbidding side of the work of one of his professors, who built tables of interdependent values such that, as soon as a single value was modified, several others had to be recalculated. He took on the task of developing an application, usable on a computer, to automate this kind of series of operations. In record time, the young Bricklin developed VisiCalc, the first spreadsheet in the history of computing and the ancestor of the famous Excel. Developed for the Apple II, a machine timidly marketed since 1977, and reaching calculation speeds unimaginable for the human brain, the VisiCalc-Apple II pair enjoyed resounding commercial success from 1979. This was the beginning of the democratization of the microcomputer, and the starting point of an unprecedented ferment in the production of algorithms of all kinds and of the rise of Information and Communication Technologies (ICT). These new advances in science and technology founded what is commonly called the third industrial revolution.
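To give a sense of what made this automation so valuable, here is a minimal sketch, in Python, of the idea behind a spreadsheet: cells hold either raw values or formulas over other cells, so that changing a single input is enough for every dependent cell to follow. This is purely illustrative and assumes nothing about VisiCalc’s actual design; the Sheet class and the cell names below are hypothetical.

```python
# Illustrative sketch of spreadsheet-style dependent recomputation.
# Not VisiCalc's actual design; all names here are hypothetical.

class Sheet:
    def __init__(self):
        self.values = {}    # cell name -> raw value
        self.formulas = {}  # cell name -> function of the sheet

    def set_value(self, cell, value):
        self.values[cell] = value

    def set_formula(self, cell, fn):
        self.formulas[cell] = fn

    def get(self, cell):
        # A formula cell is recomputed on demand, so it always reflects
        # the current values of the cells it depends on.
        if cell in self.formulas:
            return self.formulas[cell](self)
        return self.values[cell]

sheet = Sheet()
sheet.set_value("A1", 100)                                    # unit price
sheet.set_value("A2", 3)                                      # quantity
sheet.set_formula("A3", lambda s: s.get("A1") * s.get("A2"))  # subtotal
sheet.set_formula("A4", lambda s: s.get("A3") * 1.2)          # with 20% tax

print(sheet.get("A4"))  # 360.0
sheet.set_value("A2", 5)            # change a single input value...
print(sheet.get("A4"))  # 600.0 ... and every dependent cell follows
```

Bricklin’s professor had to redo this cascade of recalculations by hand for every change; automating it is precisely what turned the microcomputer into an everyday business tool.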

By Prof. Abdou SENE, specialist in applied mathematics
