AI+Music
Situated in a UNESCO City of Music and a vibrant hub for the music industry, our research operates at the intersection of artificial intelligence (AI) and music technology.
Our core mission is to design and develop trustworthy AI systems that support creatives and deepen our understanding and appreciation of music. We create computational methods to represent, analyse, and connect musical knowledge in ways that support human creativity, enhance personalised listening experiences, and preserve our rich global musical heritage. Our work in Music Information Retrieval (MIR) focuses on core challenges like music segmentation, emotion recognition, and pattern detection from both audio and symbolic music. This feeds into the development of intelligent systems that leverage the interoperability of large-scale music data, making musical knowledge more accessible and engaging for everyone.
Our research
Our research is organised around four interconnected themes that bridge fundamental AI challenges with real-world musical applications.
Music Information Retrieval (MIR)
We develop machine learning models to unlock the secrets hidden within music. This involves automatically analysing musical content and context to recognise structural patterns, identify emotional cues, and detect similarities between tracks. This research powers the next generation of personalised music experiences and systems for computational creativity and music discovery.
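As a toy illustration of this kind of content-based analysis (not our actual pipeline), the sketch below uses the open-source librosa library to summarise two recordings by their harmonic profiles and compare them; the file names and the simple chroma-based similarity measure are assumptions made for the example.

```python
# A minimal sketch of content-based music analysis with librosa;
# file names are placeholders, not part of our research pipeline.
import librosa
import numpy as np

def chroma_fingerprint(path):
    """Load a track and summarise its harmonic content as a mean chroma vector."""
    y, sr = librosa.load(path, mono=True)
    chroma = librosa.feature.chroma_cqt(y=y, sr=sr)  # 12 pitch classes x frames
    return chroma.mean(axis=1)

def cosine_similarity(a, b):
    """Cosine similarity between two feature vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# Compare the harmonic profiles of two (hypothetical) recordings.
sim = cosine_similarity(chroma_fingerprint("track_a.wav"),
                        chroma_fingerprint("track_b.wav"))
print(f"Chroma similarity: {sim:.3f}")
```

In practice, our models learn richer representations than a single averaged feature vector, but the example shows the basic workflow of turning audio into features that can be compared across tracks.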
Responsible Music AI
The rise of Generative AI has created both exciting opportunities and ethical concerns for the creative sector. Our research into Responsible Music AI is focused on designing creative systems that empower artists, not replace them. We build tools that respect the role of human creators, ensuring their agency and ownership are preserved in the creative process.
Music Knowledge Engineering
A vast wealth of musical knowledge is scattered across the web in different formats and modalities – from liner notes and artist biographies to song analyses and historical events. Our work addresses this fragmentation by engineering multimodal knowledge graphs. We create structured, interconnected digital representations of musical data to enhance its accessibility, interoperability, and potential for reuse in new technologies.
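To make the idea of a structured, interconnected representation concrete, here is a minimal sketch using the rdflib library; the namespace, class names, and properties are illustrative placeholders rather than the vocabularies we actually adopt.

```python
# A toy music knowledge graph built with rdflib; the URIs and properties
# below are illustrative placeholders, not our production schema.
from rdflib import Graph, Namespace, Literal
from rdflib.namespace import RDF, RDFS

EX = Namespace("http://example.org/music/")
g = Graph()
g.bind("ex", EX)

# Link a song, an artist, and a place as interconnected resources.
g.add((EX.Penny_Lane, RDF.type, EX.Song))
g.add((EX.Penny_Lane, RDFS.label, Literal("Penny Lane")))
g.add((EX.Penny_Lane, EX.performedBy, EX.The_Beatles))
g.add((EX.The_Beatles, EX.formedIn, EX.Liverpool))

print(g.serialize(format="turtle"))
```

Once such triples are in place, the same entities can be linked to liner notes, biographies, and historical events from other sources, which is what makes the resulting graph multimodal and reusable.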
Musical Heritage
We apply our work to the cultural heritage sector to preserve and enrich our understanding of music history. We collaborate with cultural institutions to create "digital twins" of their collections. This unlocks new educational pathways and enables innovative ways for visitors, scholars, and the public to explore the intricate relationships between artists, songs, and historical events, fostering a deeper connection to our shared musical legacy.
Collaborations and funding
Our research is deeply interdisciplinary and strengthened by a network of strategic partnerships.
- Interdisciplinary collaborations at the University of Liverpool: We work closely with experts across the University, including the Department of Music, the School of Law and Social Justice, and the School of Architecture.
- British Music Experience (BME): We are proud to collaborate with the BME, Liverpool's national museum of popular music. This partnership provides a unique real-world setting to develop and evaluate AI technologies designed to enhance visitor engagement with musical heritage.
- University of Manchester Centre for Robotics and AI
- School of Informatics, King’s College London
- Centre for Digital Music (C4DM), Queen Mary University of London
- Digital Research Infrastructure for the Arts and Humanities (DARIAH)
Funders
- UK Research and Innovation (UKRI)
- Arts and Humanities Research Council (AHRC)
- EPSRC UK Robotics and Autonomous Systems Network (UK-RAS)
Opportunities
We are actively seeking motivated students to work with us. We offer a range of interdisciplinary projects across our core research areas, often in collaboration with our partners. Opportunities include undergraduate and postgraduate final year projects, group projects, and research internships.
We also organise hackathons and open research events! See, for example, our recent "Unlocking the BME with AI" Hackathon, where students created new technologies using the museum's collection.
Outputs and impact
- Unlocking the BME with knowledge graphs (AHRC+EPSRC): We are building an interconnected, multimodal knowledge graph from the British Music Experience. This foundational resource will power the next generation of AI solutions, providing new ways to explore and experience musical heritage.
- Upskilling cultural robots with music intelligence (UK-RAS): We are collaborating with researchers from the University of Manchester to embody music intelligence in cultural robots.
- Responsible AI music framework: In collaboration with various music stakeholders, we are identifying key features required to build trustworthy creative systems. This work aims to develop a framework for AI music solutions that are transparent, fair, and ethical by design, ensuring that technology serves to augment, not replace, human creativity.
Selected Publications
- https://www.nature.com/articles/s41597-023-02410-w
- https://ieeexplore.ieee.org/document/9787343
- https://dl.acm.org/doi/10.1145/3543507.3587428
- https://archives.ismir.net/ismir2017/paper/000036.pdf
- https://archives.ismir.net/ismir2022/paper/000010.pdf
- https://arxiv.org/abs/2503.18814
People
- Dr. Jacopo de Berardinis
- Dr. Shengchen Li (XJTLU)
For more information about AI+Music research, collaborations, or opportunities, please contact jacodb@liverpool.ac.uk.