Research
Many natural phenomena governed by the strong interaction lie beyond the reach of perturbative Quantum Chromodynamics (QCD), the regime in which expansions in Feynman diagrams apply. These include hadron spectroscopy, the QCD phase diagram, and all phenomena associated with confinement -- an essential feature of QCD. For many such non-perturbative phenomena, numerical simulations within the framework of lattice QCD (LQCD), which discretizes spacetime and formulates the theory from first principles, have proven highly successful.
However, there remain significant challenges. Several phenomena cannot be studied even within LQCD due to the so-called sign problem. In these cases, the associated partition function involves a complex phase, which renders standard Monte Carlo techniques ineffective [1].
In simple terms, the sign problem arises when evaluating integrals of highly oscillatory functions -- such as the path integral of a quantum field theory with a complex action. These integrals require computational resources that grow exponentially with system size, making them infeasible on classical computers.
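This exponential blow-up can be seen in a toy reweighting experiment. The sketch below (the Gaussian "action", the coupling, and the function names are all illustrative, not part of the text) samples configurations from the magnitude of the weight and measures the residual phase. The "average sign" decays exponentially with the number of sites, so for larger systems the signal drowns in the fixed Monte Carlo noise:

```python
import numpy as np

rng = np.random.default_rng(0)

def average_sign(n_sites, n_samples=100_000, coupling=1.0):
    # Toy model: n_sites independent Gaussian modes with an oscillatory
    # phase factor exp(i * coupling * sum(x)). Reweighting means sampling
    # from |weight| (the Gaussian) and measuring the leftover phase.
    x = rng.standard_normal((n_samples, n_sites))
    phases = np.exp(1j * coupling * x.sum(axis=1))
    # The magnitude of the mean phase is the "average sign"; its exact
    # value here is exp(-coupling**2 * n_sites / 2).
    return np.abs(phases.mean())

for n in (1, 5, 10, 20):
    print(n, average_sign(n))
```

Already at 20 sites the exact average sign (about 4.5e-5) is far below the statistical error of 100,000 samples (about 3e-3), so exponentially many samples would be needed to resolve it.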
While various advanced techniques and effective dual models with positive Boltzmann weights can sometimes circumvent the sign problem by focusing on specific, relevant degrees of freedom, these approaches still face significant limitations. Crucially, they are unable to simulate real-time dynamical processes involving strong interactions -- such as particle collisions and propagation -- or matter at finite chemical potential, which is essential for understanding the matter-antimatter asymmetry and the conditions of the early universe.
In recent years, the emergence of prototype quantum computers has opened a new and promising path forward. Quantum computing offers a fundamentally different approach: it stores and manipulates the exponentially large Hilbert space directly in quantum hardware, allowing dynamical processes to be explored without the immense classical resources this would otherwise require.
Quantum computers are poised to revolutionize our ability to study non-perturbative aspects of QCD and other lattice gauge theories (LGTs). A fully fault-tolerant quantum computer could enable the simulation of real-time evolution, collider physics, the early universe, neutron star cores, and strongly interacting matter at finite chemical potential -- phenomena that are currently inaccessible due to the sign problem.
The key advantage of quantum computing lies in its exponentially efficient storage and manipulation of the Hilbert space [2]. This makes it possible to use Hamiltonian formulations of quantum field theories that are inherently free of the sign problem.
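A minimal sketch of what such a Hamiltonian formulation looks like, using a transverse-field Ising chain as a stand-in for a lattice field theory (the model, couplings, and function names are illustrative assumptions, not taken from the text). Real-time evolution is performed directly on the state vector, so no oscillatory integral and hence no sign problem appears -- but classical storage grows as 2**L, which is precisely the cost a quantum computer would avoid:

```python
import numpy as np
from functools import reduce

I2 = np.eye(2)
X = np.array([[0., 1.], [1., 0.]])
Z = np.array([[1., 0.], [0., -1.]])

def op_at(op, site, L):
    """Embed a single-site operator at `site` in an L-site chain."""
    factors = [op if i == site else I2 for i in range(L)]
    return reduce(np.kron, factors)

def ising_hamiltonian(L, J=1.0, h=1.0):
    # H = -J sum Z_i Z_{i+1} - h sum X_i (open boundary conditions)
    H = np.zeros((2**L, 2**L))
    for i in range(L - 1):
        H -= J * op_at(Z, i, L) @ op_at(Z, i + 1, L)
    for i in range(L):
        H -= h * op_at(X, i, L)
    return H

L = 4                      # state vector has 2**L = 16 components
H = ising_hamiltonian(L)
evals, evecs = np.linalg.eigh(H)

# Real-time evolution |psi(t)> = exp(-i H t) |psi(0)> of a product state.
psi0 = np.zeros(2**L); psi0[0] = 1.0
t = 1.0
psi_t = evecs @ (np.exp(-1j * evals * t) * (evecs.conj().T @ psi0))

# Unitary evolution preserves the norm exactly.
print(np.abs(np.vdot(psi_t, psi_t)))
```

Exact diagonalization like this is only feasible for a handful of sites; the exponential growth of the state vector is exactly why the Hilbert space is delegated to qubits.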
Nevertheless, current quantum devices are still in their infancy. They are not yet fault-tolerant, suffer from noise, and possess only limited quantum resources. We are currently in what John Preskill has termed the Noisy Intermediate-Scale Quantum (NISQ) era [3]. As a result, only simplified models on small lattice sizes can be studied, and sophisticated error mitigation techniques must be employed to extract meaningful results.
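One widely used error-mitigation technique of the kind alluded to above is zero-noise extrapolation (the text does not name a specific method; the noise model, decay rate, and observable value below are illustrative assumptions). The expectation value is measured at several deliberately amplified noise levels and extrapolated back to the zero-noise limit:

```python
import numpy as np

EXACT = 0.80  # true noiseless expectation value, assumed for the demo

def noisy_expectation(scale, eps0=0.05, decay=3.0):
    # Model: the measured value decays exponentially with the noise
    # level scale * eps0 (a common depolarizing-noise ansatz).
    return EXACT * np.exp(-decay * scale * eps0)

# Measure at amplified noise levels (on hardware, e.g. via gate folding).
scales = np.array([1.0, 2.0, 3.0])
values = noisy_expectation(scales)

# Exponential fit: log(value) is linear in the noise scale, so the
# intercept of a linear fit gives the zero-noise estimate.
slope, intercept = np.polyfit(scales, np.log(values), 1)
mitigated = np.exp(intercept)

print(values[0], mitigated)  # raw estimate vs mitigated estimate
```

Under this idealized noise model the extrapolation recovers the exact value; on real devices the fit only partially corrects the bias, which is why such techniques are combined and carefully validated.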
While these limitations constrain the scope of present-day research, they also present valuable opportunities. Working within these constraints encourages deeper theoretical insight and drives the development of new quantum algorithms. Moreover, by actively using quantum hardware, the research community gains crucial experience in quantum simulation and can help shape the design of future quantum systems, much like the co-evolution of supercomputers and computational physics.
[1] M. Troyer and U.-J. Wiese, Computational complexity and fundamental limitations to fermionic quantum Monte Carlo simulations, Phys. Rev. Lett. 94 (2005) 170201 [cond-mat/0408370]
[2] C. W. Bauer et al., Quantum simulation for High Energy Physics [2204.03381]
[3] J. Preskill, Quantum Computing in the NISQ era and beyond, Quantum 2 (2018) 79 [1801.00862]