Technology: thanks to artificial intelligence, the computers of tomorrow will have little in common with the ones we know

Redesigned MacBook Air laptops are displayed during WWDC22 on June 06, 2022 in Cupertino, California.




Artificial intelligence is redefining computing. In the near future, computers could improve their performance over time thanks to AI and deep learning.

Atlantico: Artificial intelligence is increasingly present in our computers. How is it technically reinventing what a computer is? What are the main changes these technologies bring?

Thierry Berthier: Indeed, artificial intelligence is deployed both in applications (software) and in hardware, in the form of specific chips such as GPUs (Graphics Processing Units) or TPUs (Tensor Processing Units) dedicated to machine learning. Integrating chips optimized to run neural networks relieves the central processor of part of the computational load and handles learning applications on images, video and natural language processing. GPU and TPU chips make it possible to "vectorize" intensive calculations and accelerate the processing of AI models. The architectures of computers and smartphones are evolving to integrate more and more chips dedicated to machine learning. These chips are also embedded in the hardware of connected objects and robotic systems to give them learning, classification and automatic detection capabilities. These objects are no longer content simply to be connected; they are enriched with information processing capabilities.

When these chips are embedded in mobile robotic systems equipped with sensors, they make it possible to process the data flowing from those sensors locally and in real time (in the hardware of the drone or robot), giving the machine autonomy in its movements and actions. These capabilities are especially useful when the drone or robot is no longer connected to its ground control station and no longer receives orders from its base. In that situation, the drone has to fend for itself: it must map its environment in real time, detect obstacles and potential pitfalls, and build its own route, in complete autonomy. Neuromorphic chips provide computing power optimized for this type of processing.
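To make the idea of "vectorizing" concrete, the sketch below applies a toy neural-network layer to a whole batch of inputs with a single matrix multiplication, which is the kind of operation GPU and TPU chips accelerate. NumPy on a CPU is used purely for illustration, and all sizes and values are made up.

```python
import numpy as np

# A toy dense layer applied to a batch of inputs. On a GPU or TPU the
# same matrix product is dispatched across thousands of parallel units;
# here NumPy simply illustrates the "vectorized" form of the computation.
rng = np.random.default_rng(0)
weights = rng.normal(size=(4, 3))   # 4 input features -> 3 outputs
batch = rng.normal(size=(256, 4))   # 256 samples processed at once

# One matrix multiplication replaces a triple nested loop of
# 256 x 3 x 4 scalar multiply-adds.
activations = np.maximum(batch @ weights, 0.0)  # ReLU non-linearity
print(activations.shape)  # (256, 3)
```

The speedup on dedicated hardware comes from exactly this structure: the same arithmetic applied uniformly to large blocks of data.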


Chris Bishop, director of Microsoft Research in the UK, has said: "For the past 40 years we have programmed computers; over the next 40 years, we will train them." To what extent is this already the reality of machine learning on computers?

This is a very probable evolution: future computers will integrate machine learning capabilities "by design" in their architectures, operating systems, data flow management, virtualization and applications. The computer will learn to know its user better: habits, needs, expectations. Virtual assistants will grow in power and performance by offering new functionality. The way new applications are designed will also evolve, with specifications written at a higher functional level. This idea is not new: as early as the 2000s, computer-aided software engineering (CASE) tools already made it possible to automate the design of simple programs. This approach to software engineering will evolve thanks to AI.

Cybersecurity already benefits from the power of machine learning in attack detection tools. Network monitoring solutions of the SIEM (Security Information and Event Management) and UEBA (User and Entity Behavior Analytics) type rely on AI to build models of a network's normal operation and then identify deviations from that baseline. This data-centric approach makes it possible to detect new, stealthy threats and attacks that are not referenced in any attack database. As in robotics, AI brings autonomy to systems that detect attacks or data theft.
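As a rough illustration of the normality-model idea behind UEBA-style monitoring, the sketch below learns a simple statistical baseline from hypothetical traffic features and flags samples that deviate strongly from it. Real SIEM/UEBA products use far richer models; every feature name and number here is invented.

```python
import numpy as np

# Hypothetical per-host traffic features (bytes/min, connections/min)
# observed during normal operation; purely illustrative values.
baseline = np.array([[120.0, 4.0], [130.0, 5.0], [118.0, 4.0],
                     [125.0, 6.0], [122.0, 5.0]])
mean = baseline.mean(axis=0)
std = baseline.std(axis=0)

def is_anomalous(sample, threshold=3.0):
    """Flag a sample whose z-score deviates strongly from the baseline."""
    z = np.abs((sample - mean) / std)
    return bool((z > threshold).any())

print(is_anomalous(np.array([124.0, 5.0])))   # normal traffic -> False
print(is_anomalous(np.array([900.0, 50.0])))  # large deviation -> True
```

The detection criterion is learned from data rather than written as a fixed rule, which is why such systems can flag behavior absent from any attack signature database.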


For decades, getting a computer to do something meant typing a command, or at least clicking a button. That is no longer necessarily the case. How does it work?

First of all, this is not a novelty related to AI. A computer regularly performs functions in the background: for example, it can optimize its memory, run a security update completely automatically without human action, or perform a data backup or archiving without any click or typed command. There is generally no AI behind these functional automatisms. Your question applies more to the level of autonomy that future machines will offer by default. One can imagine that when a new computer is purchased, once it has been unpacked and started up, the machine begins by learning its user's habits in order to optimize the services it will subsequently offer. Once this initial learning phase is complete, the machine could continue learning over time, constantly improving its performance. Embedded AIs could also evolve to become much more generalist than today's still very vertical learning models. The machine's cyber-protection could likewise evolve and improve throughout its operation. It is in this sense that "no click", or the absence of a command typed by the user, should be understood.

What changes could AI bring to computers in the years ahead?

The AI embedded in applications will certainly evolve in several directions. Models will first of all become more frugal in their training phase: concretely, it will take far less data to train a model than it does today. Reinforcement learning (RL) will make it possible to build applications capable of learning over time by observing interactions. Today's AIs are still very vertical: effective at a single type of task (classification or detection) but not adaptable to other kinds of functions. Specialized AIs will necessarily evolve towards more generalist models. AI embedded in robotic systems will transform industry, agriculture, mobility, services, medicine and logistics. Finally, the military applications of AI will transform the art of war and force a profound revision of doctrines.
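The "learning over time by observing interactions" that reinforcement learning enables can be sketched with a minimal multi-armed bandit: the agent starts knowing nothing and, purely from observed rewards, gradually settles on the better action. The action names and reward probabilities are illustrative assumptions, not from the interview.

```python
import random

# Minimal bandit sketch of learning from interaction. The agent does not
# know these reward probabilities; it only observes rewards after acting.
true_reward = {"action_a": 0.2, "action_b": 0.8}  # illustrative values
estimates = {a: 0.0 for a in true_reward}
counts = {a: 0 for a in true_reward}
random.seed(1)

for step in range(2000):
    # epsilon-greedy: mostly exploit the best current estimate,
    # occasionally explore another action
    if random.random() < 0.1:
        action = random.choice(list(true_reward))
    else:
        action = max(estimates, key=estimates.get)
    reward = 1.0 if random.random() < true_reward[action] else 0.0
    # incremental average keeps a running estimate per action
    counts[action] += 1
    estimates[action] += (reward - estimates[action]) / counts[action]

print(max(estimates, key=estimates.get))  # the better action wins out
```

No behavior was programmed in advance here; the preference for the better action emerges entirely from accumulated experience, which is the pattern RL scales up to far richer tasks.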
