The great promises of the quantum computer

In October, the Nobel Prize in Physics was awarded to Alain Aspect, John Clauser and Anton Zeilinger. “Using groundbreaking experiments, they demonstrated the potential to investigate and control particles that are in entangled states. What happens to one particle in an entangled pair determines what happens to the other, even if they are too far apart to affect each other. The development of experimental tools by the laureates laid the foundations for a new era of quantum technology,” the Nobel committee explained.

Accelerate the discovery of new materials

The coverage devoted to this prize has helped clarify what is at stake, and quantum computing is chief among the applications. There is a great deal of buzz around quantum computers, and for good reason: these futuristic machines are designed to mimic what happens in nature at microscopic scales. In other words, they could help us better understand the quantum realm and accelerate the discovery of new materials, from pharmaceuticals to more environmentally friendly chemicals.

But above all, quantum computers are credited with the ability to dramatically cut computation times for fields such as climate science or nuclear simulations. So, where do we stand in building them?

The processing unit of a quantum computer is the qubit, the quantum analogue of the classical bit. Different materials can be used to produce qubits, but no one is yet sure which will prove best for building an efficient machine. “To date, there have only been small demonstrations of silicon quantum chips with high-quality qubit operations. We now demonstrate a six-qubit silicon chip that operates with low error rates,” explain researchers from the Delft University of Technology in the Netherlands, who published a study in the prestigious journal “Nature” (1) a week before the Nobel was awarded. It is considered a major step towards a fault-tolerant quantum computer built in silicon.
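To picture what a qubit is, it helps to write one down. Below is a minimal sketch in Python with NumPy (ours, not taken from the Delft paper): a qubit is a normalized two-component complex vector, a gate is a unitary matrix, and measurement returns 0 or 1 with probabilities given by the Born rule.

```python
import numpy as np

# A qubit is a normalized vector a|0> + b|1> in C^2, with |a|^2 + |b|^2 = 1.
ket0 = np.array([1, 0], dtype=complex)

# Gates are unitary matrices; the Hadamard gate puts |0> into an equal superposition.
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)
psi = H @ ket0  # amplitudes (1/sqrt(2), 1/sqrt(2))

# Measurement yields 0 or 1 with Born-rule probabilities |a|^2 and |b|^2.
probs = np.abs(psi) ** 2
rng = np.random.default_rng(seed=0)
shots = rng.choice([0, 1], size=1000, p=probs)
print(f"P(0) = {probs[0]:.2f}, P(1) = {probs[1]:.2f}")
print(f"Frequency of 0 over 1000 shots: {np.mean(shots == 0):.3f}")
```

The six qubits of the Delft chip jointly live in a space of 2^6 = 64 such amplitudes, which is why even small chips are already meaningful testbeds.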

A whole architecture to build

For the authors, in fact, “the challenge of quantum computing today consists of two parts. The first is to develop qubits of sufficient quality and reliability; the second is to develop an architecture that makes it possible to build large systems of qubits.” In this respect, the six-qubit chip constitutes a first brick of such an architecture. “This breakthrough will allow testing of increasingly meaningful quantum protocols and is a major stepping stone to large-scale quantum computers,” the authors conclude.

Until then, we will have to make do with our good old classical computers, which obey Moore’s law (see box). But then, aren’t they enough to perform quantum calculations? In the introduction to an article published in November 2021 (2), Xavier Waintal, a physicist at the CEA, explains that “the concept of a quantum computer is based on systems belonging to quantum nanoelectronics (superconductors, semiconductors), quantum optics or atomic physics.

“With this computer, there is above all the promise of a very precise description of these systems by mathematical models. This model is an instance of a more general problem, called the quantum many-body problem, which physicists have been studying for decades.” So, without a quantum computer, how can we best exploit our powerful classical computers to advance our understanding of complex quantum systems? This is the question posed by a team from the California Institute of Technology in a recent study (3), published in the journal “Science” shortly before the Nobel Prize in Physics was awarded. “We have rigorously established that classical machine learning algorithms, informed by data collected from physical experiments, can efficiently solve certain quantum many-body problems,” the authors explain.
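To appreciate why the many-body problem strains classical machines, note that describing N qubits or spins exactly requires 2^N complex amplitudes, so memory and time explode as systems grow. The sketch below (ours, for illustration only; none of the cited papers use this code) builds the Hamiltonian of a small Heisenberg spin chain and finds its ground state by brute-force diagonalization; every added spin doubles the storage.

```python
import numpy as np
from functools import reduce

# Pauli matrices and the single-site identity.
I2 = np.eye(2)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Y = np.array([[0, -1j], [1j, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)

def embed(op, site, n):
    """Embed a single-site operator into the 2**n-dimensional chain Hilbert space."""
    return reduce(np.kron, [op if k == site else I2 for k in range(n)])

def heisenberg(n):
    """H = sum_i (X_i X_{i+1} + Y_i Y_{i+1} + Z_i Z_{i+1}) on an open chain of n spins."""
    dim = 2 ** n
    H = np.zeros((dim, dim), dtype=complex)
    for i in range(n - 1):
        for P in (X, Y, Z):
            H += embed(P, i, n) @ embed(P, i + 1, n)
    return H

n = 10  # 2**10 = 1024 amplitudes; each extra spin doubles this number
H = heisenberg(n)
ground_energy = np.linalg.eigvalsh(H)[0]
print(f"{n} spins -> Hilbert space dimension {2**n}")
print(f"Ground-state energy: {ground_energy:.4f}")
```

At around 20 spins this brute-force approach already needs matrices with over a trillion entries, and that wall is exactly what motivates both quantum computers and the machine learning shortcut described next.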

Thus, the computer learns from training data obtained from classical simulations or quantum experiments. The trained model can then produce a classical representation of the ground state of a physical system that it did not encounter during learning. The technique has already proven itself in everyday life (when we dictate a message to a smartphone) and in medicine (diagnostic imaging, melanoma screening). Machine learning is therefore a good stopgap until qubits find their way onto our future motherboards, for better or for worse…
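As a toy illustration of that workflow, consider the following sketch (ours; the Science paper itself relies on classical shadows and provably efficient kernel methods, not this simplified setup). Ground-state properties of a small transverse-field Ising chain are computed by exact simulation for a handful of field values, and a classical kernel model from scikit-learn then predicts the same property at field values it never saw during training.

```python
import numpy as np
from functools import reduce
from sklearn.kernel_ridge import KernelRidge

I2 = np.eye(2)
X = np.array([[0., 1.], [1., 0.]])
Z = np.array([[1., 0.], [0., -1.]])

def embed(op, site, n):
    """Embed a single-site operator into the 2**n-dimensional chain Hilbert space."""
    return reduce(np.kron, [op if k == site else I2 for k in range(n)])

def transverse_magnetization(g, n=6):
    """Ground-state <X> of the Ising chain H = -sum Z_i Z_{i+1} - g * sum X_i."""
    H = -sum(embed(Z, i, n) @ embed(Z, i + 1, n) for i in range(n - 1))
    H = H - g * sum(embed(X, i, n) for i in range(n))
    _, vecs = np.linalg.eigh(H)
    gs = vecs[:, 0]  # ground state from exact diagonalization
    Xavg = sum(embed(X, i, n) for i in range(n)) / n
    return float(gs @ Xavg @ gs)

# "Training data": a handful of ground-state properties from classical simulation...
g_train = np.linspace(0.1, 2.0, 12).reshape(-1, 1)
y_train = np.array([transverse_magnetization(g) for g in g_train.ravel()])

# ...from which a purely classical kernel model generalizes to unseen field values.
model = KernelRidge(kernel="rbf", alpha=1e-6, gamma=2.0).fit(g_train, y_train)
for g in (0.75, 1.5):
    pred = model.predict([[g]])[0]
    exact = transverse_magnetization(g)
    print(f"g = {g}: predicted <X> = {pred:.4f}, exact = {exact:.4f}")
```

The same pattern extends to training data taken from real quantum experiments rather than simulations, which is precisely the regime the paper analyzes.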

(1) “Universal control of a six-qubit quantum processor in silicon”, “Nature”, September 2022.
(2) “The many-body problem behind the quantum computer”, “Reflets de la physique”, November 2021.
(3) “Provably efficient machine learning for quantum many-body problems”, “Science”, September 2022.


Moore’s Law, what next?

It was formulated in 1965 by Gordon Moore, then head of research at Fairchild Semiconductor (he would go on to co-found Intel in 1968). He observed that the number of transistors that could be placed on a chip, at constant cost, had doubled every year since the invention of the integrated circuit. This exponential increase was quickly dubbed “Moore’s Law.”

Ten years later, in 1975, Moore revised his prediction, estimating that the number of transistors on a silicon chip doubles every two years. And indeed, between 1971 and 2001, transistor density doubled every 1.96 years. As a result, electronic machines have become ever smaller, cheaper, faster and more powerful.
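To make the compounding concrete, here is the two-line arithmetic behind that figure (a sketch of ours, using the 1.96-year doubling period quoted above):

```python
# Doubling every T years gives a growth factor of 2**(t / T) after t years.
T = 1.96             # observed doubling period, in years
t = 2001 - 1971      # the 30-year span cited above
print(f"Growth factor: {2 ** (t / T):,.0f}x")  # roughly 40,000-fold
```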

However, since 2015 this law has been under debate. The Moore curve has flattened to the point that some computer scientists predict its end by 2025. This is why others are thinking about “More than Moore,” a term in use since the early 2000s. It refers to the future of this law, to what comes after miniaturization: stacking transistors, manufacturing 3D chips, integrating artificial intelligence, changing materials, or combining several dedicated functions on a single chip. The quantum computer, though rarely named as such, is part of this “More than Moore.”
