Artificial intelligence at the edge of discovery
Particle physicists seek to answer some of the most profound questions about matter, energy, and the fundamental structure of the universe. Answering these questions requires not just powerful particle accelerators like the Large Hadron Collider (LHC) at CERN, but also cutting-edge technologies to understand and interpret the data they generate.
Particle physics has long been in the age of big data, and artificial intelligence (AI) is now a critical technology. Our researchers have been at the forefront of this transformation for over a decade, pioneering the use of machine learning (ML) and AI to accelerate discovery across key experiments at the LHC.
Finding new physics
The ATLAS experiment, one of the two general-purpose detectors at the LHC, generates vast volumes of data from proton–proton collisions. To unlock new physics from this data, Liverpool physicists are using AI/ML to push the boundaries of discovery.
Professor Monica D’Onofrio, Head of Research at the Department of Physics and incoming ATLAS-UK spokesperson, has pioneered AI-driven approaches in ATLAS searches for physics beyond the Standard Model. In 2023 and 2024, together with researchers and postgraduate students in her team, she published searches applying AI tools to hunt for dark matter candidates predicted by supersymmetry, as well as for new particles from more complex hidden (dark) sectors. The AI tools helped identify complex collision patterns in challenging environments where traditional analysis methods struggle.
As co-lead of the project MUCCA (“Multi-disciplinary Use Cases for Convergent new Approaches to AI explainability”), she aims to make AI methods more transparent and explainable (xAI), as well as to extend their applicability in high-energy physics (HEP) and beyond. An article based on her work with collaborators within the MUCCA consortium demonstrates how these methods can improve the sensitivity of collider searches to rare signals through graph neural networks and innovative built-in xAI approaches.
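To give a flavour of the idea (this is an illustrative sketch only, not the MUCCA code or the actual ATLAS analysis), a graph neural network treats the reconstructed objects in a collision as nodes of a graph and lets each node aggregate information from its neighbours. A single message-passing step, with made-up features and untrained weights, can be written in plain Python:

```python
import numpy as np

# Toy event graph: 4 reconstructed objects (nodes), 3 features each.
# Values are illustrative stand-ins (e.g. pT, eta, phi), not real data.
x = np.array([[50.0,  0.1,  0.3],
              [30.0, -1.2,  2.1],
              [20.0,  0.8, -2.9],
              [10.0,  2.0,  1.0]])

# Hypothetical edges connecting objects that are "close" in the detector.
edges = [(0, 1), (1, 2), (2, 3), (3, 0)]

def message_passing(x, edges, w):
    """One GNN layer: each node averages its neighbours' features,
    then a learned linear map w and a ReLU produce new embeddings."""
    agg = np.zeros_like(x)
    deg = np.zeros(len(x))
    for i, j in edges:                 # undirected graph: both directions
        agg[i] += x[j]; agg[j] += x[i]
        deg[i] += 1;    deg[j] += 1
    agg /= deg[:, None]                # mean aggregation over neighbours
    h = np.concatenate([x, agg], axis=1) @ w
    return np.maximum(h, 0.0)          # ReLU non-linearity

rng = np.random.default_rng(0)
w = rng.normal(size=(6, 4)) * 0.1      # untrained weights, for shape only
h = message_passing(x, edges, w)
print(h.shape)                         # per-node embeddings: (4, 4)
```

In a real search, several such layers would be stacked and the final embeddings pooled into a per-event score separating signal from background.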
AI/ML techniques are not only relevant for searches for new physics: they are also key to identifying short-lived particles, such as beauty-flavoured hadrons and tau leptons, that are crucial for understanding the Higgs boson. Among others, Professor Carl Gwilliam and Dr Jordy Degens are leading efforts to test and validate these methods in searches for the rare production of Higgs boson pairs at ATLAS. This work is crucial for advancing our understanding of the early universe. It is a primary focus of the upcoming LHC runs and of the Future Circular Collider (FCC), as highlighted at the recent European Strategy for Particle Physics symposium, building on work carried out by Liverpool physicists using AI/ML tools.
Breaking the limits of the Standard Model
The LHCb experiment is investigating a type of elementary particle called the beauty quark, whose behaviour sheds light on fundamental questions, such as the asymmetry between matter and antimatter that shaped the evolution of the universe.
With the upgraded LHCb detector now coping with up to 30 million events per second, AI and ML techniques enable us to analyse these collisions in real time, helping to identify interesting events and measure complex decay patterns with incredible precision.
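The real LHCb trigger is a sophisticated multi-stage system; purely as an illustrative sketch (with hypothetical features and thresholds, not LHCb's actual selections), the basic idea of filtering a huge event stream down to the interesting fraction can be shown with a vectorised selection over simulated data:

```python
import numpy as np

# Hypothetical per-event features: a momentum-like quantity and a
# displaced-vertex score (illustrative distributions, not LHCb inputs).
rng = np.random.default_rng(42)
events = rng.exponential(scale=[20.0, 0.2], size=(1_000_000, 2))

def trigger(batch, pt_cut=60.0, vtx_cut=0.8):
    """Keep an event if either selection fires -- a toy stand-in for
    the multivariate selections running in a real software trigger."""
    return (batch[:, 0] > pt_cut) | (batch[:, 1] > vtx_cut)

kept = trigger(events)
rate = kept.mean()
print(f"kept {kept.sum()} of {len(events)} events ({rate:.1%})")
```

The point of the sketch is the shape of the problem: millions of events arrive, only a few per cent survive, and the selection must be cheap enough to run on every single one.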
Under the leadership of Dr Eduardo Rodrigues, Senior Research Physicist in the Department of Physics, Liverpool led the overhaul of LHCb’s offline data processing framework between 2020 and 2024. The new system fully capitalises on the detector’s increased data output, ensuring that no valuable signals are missed.
The LHCb experiment has also explored Quantum Machine Learning (QML) for identifying the charge of b-quark jets. In 2022, Dr Rodrigues was awarded a prestigious CERN Scientific Associateship, enabling collaboration with leading researchers and access to cutting-edge facilities. With AI and ML at the heart of LHCb's data analysis, and quantum computers and quantum technologies advancing rapidly, the team now faces the complex challenge of investigating whether and how quantum algorithms can be executed on such new hardware, and whether LHCb's particle physics use cases can benefit from the new technology and paradigm that is Quantum Computing.
Beyond algorithms
Liverpool researchers are also innovating in the infrastructure that powers modern physics research. In collaboration with Intel, our team is testing oneAPI, a programming platform that simplifies the development of applications for field-programmable gate arrays (FPGAs), specialised chips ideal for real-time tasks. Dr Karol Hennessy and Dr Kurt Rinnert are investigating the use of High Bandwidth Memory (HBM) on these devices to enable more complex algorithms, such as real-time detector alignment and calibration.
At LHCb, the lowest-level event trigger, responsible for deciding which collisions are worth keeping, now runs on Nvidia GPUs. This system handles an astonishing 40 terabits of raw data per second, currently the highest data-collection rate of any high-energy physics detector. Members of our team helped implement this system and now lead efforts to reuse this powerful GPU infrastructure for offline analysis.
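To put those numbers in perspective, a quick back-of-the-envelope calculation (assuming the 30 million events per second quoted above; the real figures vary with running conditions) gives the average raw size of a single event:

```python
data_rate_bits = 40e12   # 40 terabits per second (from the text)
event_rate = 30e6        # ~30 million collisions per second (from the text)

bits_per_event = data_rate_bits / event_rate
kilobytes_per_event = bits_per_event / 8 / 1000
print(f"~{kilobytes_per_event:.0f} kB per event on average")  # ~167 kB
```

Every one of those events must be inspected by the GPU trigger in real time, which is why the choice of hardware and algorithms matters so much.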
AI for discovery and the wider society
As particle physics ventures deeper into the unknown, AI is helping scientists ask better questions and get faster, clearer answers. Engaging widely to better exploit AI methods for particle physics and beyond is crucial, and Liverpool’s contribution spans the full pipeline.
Professor D’Onofrio contributes to long-term AI strategies for particle physics in the UK and internationally. She co-coordinates efforts around “Enabling AI for High Energy Physics (HEP)” (PDF, 485 KB), a national initiative supporting the use of AI in theoretical and experimental HEP. The effort not only promotes wider AI adoption in HEP but also strengthens knowledge exchange and industry engagement. Dr Rodrigues is equally invested in AI for HEP and other EU initiatives for sustainable computing, software development and the training of the next generation of researchers.
Beyond particle physics, Liverpool is engaged in the development of AI/ML tools across fields, from quantum technologies to beam diagnostics in accelerator science, studies of cold-atom crystals, and the use of medical imaging for cancer diagnosis and prognosis. Increasingly, projects applying AI to healthcare and broader societal challenges are emerging from HEP. The theme of AI and Data Science has been fully embraced in the training of the next generation, through the Centre for Doctoral Training LIV.INNO and, previously, the LIV.DAT initiative, led by Professor Carsten Welsch.
Whether it’s searching for dark matter, understanding the Higgs boson or tackling pressing medical challenges, we develop tools that not only bring us closer to solving the greatest mysteries in science, but could one day transform society for the better.