Large-scale Tensorially Structured Data for Quantum Technologies and Beyond

Tensor networks are powerful tools for describing quantum systems, for probabilistic modeling, and for learning dynamical laws. In this project, we advance these methods to enable large-scale simulations of quantum systems and beyond, and we explore new areas of applicability.

MATH+ AA-Tech-2
Jul 2025 to Jun 2027

🧑‍🎓 Project Members

Jens Eisert
Principal Investigator
jense (at) zedat.fu-berlin.de
Michael Hintermüller
Principal Investigator
hint (at) mathematik.hu-berlin.de
Patrick Gelß
Principal Investigator
gelss (at) zib.de

🪙 Funding

This project is funded from July 2025 to June 2027 by the Berlin Mathematics Research Center MATH+ (project ID AA-Tech-2), which is itself funded by the German Research Foundation (DFG) under Germany's Excellence Strategy (EXC-2046/1, project ID 390685689).

🔬 Project Description

Tensor networks have long been recognized as a powerful tool for efficiently representing operators or data with suitable structure. These approaches decompose a large tensor into smaller building blocks, so that the overall value can be computed through tensor network contractions. Such methods are central to the study of complex quantum systems and condensed matter theory, where geometric locality is key. In fact, one-dimensional tensor networks, called tensor trains in mathematics and matrix product states in physics, serve as a parameterization of one-dimensional phases of matter. There are deep reasons for this: one-dimensional tensor networks with small bond dimensions capture the natural correlation patterns in these systems. For higher-dimensional systems, analogous ansatz families have been developed. The study of tensor networks, rooted in density matrix renormalization group methods, is a cornerstone of classical modeling in condensed matter physics.

This project aims to develop new classical algorithms for simulating noisy quantum circuits at an unprecedented scale. It also proposes new approaches to dissipative engineering of variational quantum circuits and explores the use of large-scale tensor networks in machine learning. Additionally, it investigates the efficient contraction of tensor networks on a quantum computer, a task known to face complexity-theoretic obstacles on classical computers, and examines learning problems over function dictionaries using tensor networks.
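To make the decomposition idea above concrete, here is a minimal NumPy sketch, illustrative only and not one of the project's algorithms: it splits a multi-way tensor into tensor-train cores via sequential truncated SVDs (the standard TT-SVD scheme) and contracts the cores back to verify the reconstruction. The names `tt_decompose` and `tt_contract` and the fixed `max_rank` truncation are assumptions made for this example.

```python
# Minimal tensor-train (TT) sketch with NumPy. Illustrative only: function
# names and the fixed-rank truncation are choices made for this example.
import numpy as np

def tt_decompose(tensor, max_rank=8):
    """Split an n-way tensor into TT cores of shape (r_prev, dim, r_next)
    via sequential truncated SVDs."""
    dims = tensor.shape
    cores, r_prev, mat = [], 1, tensor
    for d in dims[:-1]:
        mat = mat.reshape(r_prev * d, -1)        # unfold: current site vs. the rest
        u, s, vt = np.linalg.svd(mat, full_matrices=False)
        r = min(max_rank, s.size)                # truncate to the target bond dimension
        cores.append(u[:, :r].reshape(r_prev, d, r))
        mat = s[:r, None] * vt[:r]               # carry the remainder to the next site
        r_prev = r
    cores.append(mat.reshape(r_prev, dims[-1], 1))
    return cores

def tt_contract(cores):
    """Contract TT cores back into the full tensor (exponential memory;
    only sensible for small demos like this one)."""
    result = cores[0]
    for core in cores[1:]:
        result = np.tensordot(result, core, axes=([-1], [0]))
    return result.squeeze(axis=(0, -1))          # drop the trivial boundary bonds

# Demo: a 4-way tensor is recovered exactly when the bond dimension is not truncated.
rng = np.random.default_rng(0)
t = rng.normal(size=(4, 4, 4, 4))
cores = tt_decompose(t, max_rank=16)
print([c.shape for c in cores])                  # [(1, 4, 4), (4, 4, 16), (16, 4, 4), (4, 4, 1)]
print(np.allclose(tt_contract(cores), t))        # True
```

With a smaller `max_rank`, the same routine yields a compressed approximation rather than an exact reconstruction, which is precisely the regime in which tensor networks pay off for suitably structured data.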