Research

There are many natural phenomena governed by the strong interaction that cannot be described and understood within the perturbative approach to QCD (where Feynman diagrams are commonly employed), such as hadron spectroscopy, the QCD phase diagram, and all phenomena related to the confinement property of QCD. For many of these non-perturbative phenomena, numerical simulation within the framework of lattice QCD (LQCD), in which the theory is discretized from first principles, has proved highly successful.

Still, many phenomena cannot be studied even within the framework of LQCD, because their associated partition functions cannot be written without a complex phase, whose presence breaks down the well-established Monte Carlo techniques. This is known as the sign problem [1].

In an extremely tiny nutshell, the so-called sign problem is present in any study that involves the integration of a highly oscillatory function, such as the path integral. In these cases the integration requires exponential computing time on a classical computer.
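A toy illustration (my own, not from the references) of why oscillatory integrands are hard for Monte Carlo: sampling from a positive Gaussian weight and reweighting by the oscillatory factor cos(kx), the "average sign" one estimates decays exponentially with k, while the statistical error only shrinks as 1/sqrt(N), so the relative error blows up.

```python
import numpy as np

rng = np.random.default_rng(0)

def average_sign(k, n_samples=100_000):
    """Estimate <cos(k x)> over the Gaussian weight exp(-x^2/2).

    The exact value is exp(-k^2/2): the 'average sign' decays
    exponentially with k, while the Monte Carlo error stays
    O(1/sqrt(N)), so the relative error grows exponentially.
    """
    x = rng.normal(size=n_samples)   # sample the positive weight
    return np.cos(k * x).mean()      # reweight by the oscillatory phase

for k in (0.5, 2.0, 4.0):
    est, exact = average_sign(k), np.exp(-k**2 / 2)
    print(f"k={k}: estimate {est:+.4f}, exact {exact:.2e}")
```

For large k the exact value is buried far below the statistical noise, which is the essence of the exponential cost mentioned above.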

In many cases the sign problem can be avoided using advanced techniques, or by using dual effective models with a positive Boltzmann weight, obtained by keeping only the QCD degrees of freedom relevant to the phenomenon of interest. All these approaches, however, still share a major limitation: no dynamical strong-interaction process in real time, such as a collision or a propagation, can be simulated.

In the last few years, a pioneering new field built around prototype quantum computers has shown promise for overcoming this limitation and exploring dynamical processes, thanks to the possibility of representing a large Hilbert space without the need for huge computational resources.

Quantum computers are expected to dramatically expand our computational approach to understanding the fundamental laws of nature.

In point of fact, a fault-tolerant quantum computer would make it possible to study not only Quantum Chromodynamics but the non-perturbative phenomena of all lattice gauge theories (LGT) that cannot be numerically studied today using standard Monte Carlo methods. The study of the early universe, of a neutron star core, of strongly interacting matter in the presence of a chemical potential, of the QCD phase diagram, or simply of the real-time evolution of an LGT, and, most importantly, collider physics are some of the interesting topics affected by the sign problem.

The key advantage of a quantum computer resides in its ability to store the Hilbert space exponentially more efficiently than a classical computer [2]. This makes it possible to use the Hamiltonian formulation of a theory, which is completely free of the sign-problem limitations.
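The exponential storage advantage can be made concrete with a quick back-of-the-envelope calculation (my own sketch): a classical state vector for an n-site system of spin-1/2 degrees of freedom needs 2^n complex amplitudes, whereas quantum hardware represents the same space with n qubits.

```python
import numpy as np

def classical_state_bytes(n_sites, local_dim=2):
    """Memory needed to store one state vector of an n-site quantum
    system classically: local_dim**n_sites complex amplitudes,
    16 bytes each in double precision."""
    n_amplitudes = local_dim ** n_sites
    return n_amplitudes * np.dtype(np.complex128).itemsize

for n in (10, 30, 50):
    print(f"{n} spin-1/2 sites: {classical_state_bytes(n):.3e} bytes "
          f"classically vs. {n} qubits on quantum hardware")
```

Already at 50 sites the classical vector exceeds the memory of any existing supercomputer, while the quantum register grows only linearly with the lattice size.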

However, we should keep our enthusiasm tempered, because those fascinating future studies are not yet possible on currently publicly available quantum computers: they are not yet fault-tolerant, they are still noisy, lack error correction, and have small-scale quantum resources. We are, in fact, in the "Noisy Intermediate-Scale Quantum" (NISQ) era, as John Preskill defined it [3]. Hence, as in the last 6-7 years and perhaps for decades to come, only theories on small lattice sizes can be studied, error mitigation techniques have to be applied to reduce errors, and quantum resources in general must be carefully rationed to keep the noise level as low as possible.

These current technical limitations that force researchers to mainly study simplified theories on small lattices are nevertheless beneficial, because they stimulate a deeper understanding of the theory that could lead to the development of new approaches. Furthermore, the development of new quantum algorithms and their use on quantum hardware allows the research community to gain experience in quantum simulation, and ultimately take the opportunity to drive future hardware development toward the needs of the community, similarly to what happened with the development of supercomputers.


[1] M. Troyer and U.-J. Wiese, "Computational complexity and fundamental limitations to fermionic quantum Monte Carlo simulations", Phys. Rev. Lett. 94 (2005) 170201 [cond-mat/0408370]
[2] C. W. Bauer et al., "Quantum simulation for High Energy Physics" [2204.03381]
[3] J. Preskill, "Quantum Computing in the NISQ era and beyond", Quantum 2 (2018) 79, doi: 10.22331/q-2018-08-06-79 [1801.00862]