# Research areas

Research at IOL is divided into six distinct areas, each led by experienced researchers. Together, these areas encompass optimization theory, algorithm design, machine learning, operations research, combinatorics, and quantum computing, as well as practical applications in energy, transportation, and healthcare. Select an area below to learn more about it.

## Continuous Optimization

We research and develop algorithms for continuous optimization, with a particular emphasis on solving high-dimensional optimization problems with first-order methods. We prove convergence rates and guarantees for a variety of settings, such as:

- *Conditional gradient* (aka *Frank-Wolfe*) methods, which perform constrained optimization by accessing the feasible set only through a linear optimization oracle.
- *Proximal* methods, which yield efficient algorithms for nonsmooth optimization and practically reshape smooth ones.
- *Online learning* algorithms, which play a sequential game: the algorithm makes predictions, receives losses in an online fashion, and tries to minimize overall regret; via reductions, these algorithms have numerous applications to optimization.
- *Accelerated* methods, which combine online learning tools with other optimization techniques to exploit and improve the convergence of simpler algorithms, often to optimality.
- *Block-iterative* algorithms, which leverage parallelism and distributed computation for faster algorithmic performance.
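As a minimal illustration of the conditional gradient template (a toy sketch, not code from our projects), the following runs vanilla Frank-Wolfe on the probability simplex, where the linear optimization oracle reduces to picking the best vertex:

```python
import numpy as np

def frank_wolfe_simplex(grad, x0, steps=500):
    """Vanilla Frank-Wolfe over the probability simplex."""
    x = x0.copy()
    for t in range(steps):
        g = grad(x)
        v = np.zeros_like(x)
        v[np.argmin(g)] = 1.0        # LMO: best simplex vertex for <g, .>
        gamma = 2.0 / (t + 2.0)      # classic open-loop step size
        x = (1.0 - gamma) * x + gamma * v
    return x

# Minimize ||x - b||^2 over the simplex; b lies outside the simplex,
# so the optimum sits on its boundary (here x* = [0.6, 0.4, 0.0]).
b = np.array([0.8, 0.6, -0.2])
x_star = frank_wolfe_simplex(lambda x: 2.0 * (x - b), np.ones(3) / 3.0)
```

Every iterate is a convex combination of simplex vertices, so feasibility is maintained for free; the open-loop step size `2/(t+2)` gives the classic `O(1/t)` primal convergence rate.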

### Associated projects

- Sparsity and Sample-size Efficiency in Structured Learning (MATH+ EF1-14)
- On a Frank-Wolfe Approach for Abs-smooth Optimization (MATH+ EF1-23)
- Beyond the Worst-case (MATH+ AA3-7)
- Theoretical Foundations of Deep Learning (SPP 2298, project number 441826958)
- Decision-making for Energy Network Dynamics (MATH+ AA4-7)

### Selected publications

- Kreimeier, T., Pokutta, S., Walther, A., and Woodstock, Z. (2023).
*On a Frank-Wolfe Approach for Abs-smooth Functions*. [arXiv] [BibTeX]

- Martínez-Rubio, D., Wirth, E., and Pokutta, S. (2023). Accelerated and Sparse Algorithms for Approximate Personalized PageRank and Beyond.
*Proceedings of Annual Workshop on Computational Learning Theory*. [arXiv] [BibTeX]

- Wirth, E., Kerdreux, T., and Pokutta, S. (2023). Acceleration of Frank-Wolfe Algorithms with Open Loop Step-sizes.
*Proceedings of International Conference on Artificial Intelligence and Statistics*. [arXiv] [BibTeX]

- Búi, M. N., Combettes, P. L., and Woodstock, Z. (2022). Block-activated Algorithms for Multicomponent Fully Nonsmooth Minimization.
*Proceedings of ICASSP 2022 - 2022 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP)*, 5428–5432. DOI: 10.1109/ICASSP43922.2022.9747479 [URL] [BibTeX]

- Braun, G., Pokutta, S., and Weismantel, R. (2022).
*Alternating Linear Minimization: Revisiting von Neumann’s Alternating Projections*. [arXiv] [slides] [video] [BibTeX]

- Criado, F., Martínez-Rubio, D., and Pokutta, S. (2022). Fast Algorithms for Packing Proportional Fairness and Its Dual.
*Proceedings of Conference on Neural Information Processing Systems*. [arXiv] [poster] [BibTeX]

- Hendrych, D., Troppens, H., Besançon, M., and Pokutta, S. (2022).
*Convex Integer Optimization with Frank-Wolfe Methods*. [arXiv] [slides] [code] [BibTeX]

- Martínez-Rubio, D., and Pokutta, S. (2022). Accelerated Riemannian Optimization: Handling Constraints with a Prox to Bound Geometric Penalties.
*Proceedings of Optimization for Machine Learning (NeurIPS Workshop OPT 2022)*. [URL] [arXiv] [poster] [BibTeX]

- Wirth, E., and Pokutta, S. (2022). Conditional Gradients for the Approximately Vanishing Ideal.
*Proceedings of International Conference on Artificial Intelligence and Statistics*. [arXiv] [summary] [poster] [code] [BibTeX]

- Braun, G., and Pokutta, S. (2021).
*Dual Prices for Frank–Wolfe Algorithms*. [arXiv] [BibTeX]

## Integer Optimization

We investigate theory and solution methods for mixed-integer optimization problems (MIPs). MIPs are challenging problems for which exact solution methods integrate several algorithmic components such as relaxations, separation through cutting planes, primal heuristics, conflict analysis, and pricing for column generation. Our group works on all these aspects and on their interaction within the constraint integer programming solver SCIP, which is based on a branch-cut-and-price algorithm. We also work on algorithms combining satisfiability and mixed-integer optimization techniques such as conflict analysis and domain propagation, and on exact MIP and linear programming solving, guaranteeing and certifying that solutions are not invalidated by floating-point arithmetic errors.
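To make the branch-and-bound template concrete, here is a deliberately tiny, self-contained sketch (a toy illustration, not SCIP code): it solves a 0/1 knapsack problem by branching on variables and pruning with the bound from the LP relaxation, which for the knapsack can be solved greedily by fractional filling:

```python
def knapsack_bb(values, weights, capacity):
    """Maximize v.x subject to w.x <= capacity, x binary, by branch and bound."""
    # Sort items by value density; the LP relaxation of the knapsack is
    # solved exactly by greedily filling in this order.
    order = sorted(range(len(values)),
                   key=lambda i: values[i] / weights[i], reverse=True)
    v = [values[i] for i in order]
    w = [weights[i] for i in order]
    n = len(v)
    best = [0]  # incumbent objective value

    def bound(k, cap, base):
        # Dual bound at a node: items 0..k-1 are fixed (value `base`),
        # the rest fill the residual capacity fractionally.
        total = base
        for i in range(k, n):
            if w[i] <= cap:
                cap -= w[i]
                total += v[i]
            else:
                return total + v[i] * cap / w[i]
        return total

    def branch(k, cap, base):
        best[0] = max(best[0], base)          # every node here is feasible
        if k == n or bound(k, cap, base) <= best[0]:
            return                            # prune: bound beats nothing
        if w[k] <= cap:                       # branch x_k = 1 (if feasible)
            branch(k + 1, cap - w[k], base + v[k])
        branch(k + 1, cap, base)              # branch x_k = 0

    branch(0, capacity, 0)
    return best[0]
```

For example, `knapsack_bb([60, 100, 120], [10, 20, 30], 50)` returns `220`. Production solvers like SCIP replace each ingredient (bounding, branching rules, node selection) with far more sophisticated machinery, plus cutting planes, presolving, and conflict analysis.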

### Associated projects

- Adaptive Algorithms Through Machine Learning (MATH+ EF1-9)
- MiniMIP (miniMIP)
- Learning to Schedule Heuristics in IP

### Selected publications

- Chmiela, A., Gleixner, A., Lichocki, P., and Pokutta, S. (2023). Online Learning for Scheduling MIP Heuristics.
*Proceedings of International Conference on the Integration of Constraint Programming, Artificial Intelligence, and Operations Research*, 114–123. DOI: 10.1007/978-3-031-33271-5_8 [BibTeX]

- Eifler, L., and Gleixner, A. (2023). A Computational Status Update for Exact Rational Mixed Integer Programming.
*Mathematical Programming*, *197*, 793–812. DOI: 10.1007/s10107-021-01749-5 [URL] [BibTeX]

- Turner, M., Berthold, T., Besançon, M., and Koch, T. (2023). Cutting Plane Selection with Analytic Centers and Multiregression.
*Proceedings of International Conference on the Integration of Constraint Programming, Artificial Intelligence, and Operations Research*. [BibTeX]

- Eifler, L., Gleixner, A., and Pulaj, J. (2022). A Safe Computational Framework for Integer Programming Applied to Chvátal’s Conjecture.
*ACM Transactions on Mathematical Software*, *48*(2), 1–12. DOI: 10.1145/3485630 [BibTeX]

- Sofranac, B., Gleixner, A., and Pokutta, S. (2022). Accelerating Domain Propagation: An Efficient GPU-parallel Algorithm Over Sparse Matrices.
*Parallel Computing*, *109*, 102874. DOI: 10.1016/j.parco.2021.102874 [arXiv] [summary] [BibTeX]

- Eifler, L., and Gleixner, A.
*Safe and Verified Gomory Mixed Integer Cuts in a Rational MIP Framework*. [URL] [BibTeX]

## Global Optimization

We aim to develop efficient global optimization methods for convex and nonconvex mixed-integer nonlinear programming (MINLP) problems. MINLP encompasses a wide range of optimization problems arising in chemical engineering, power systems operation and planning, layout design, and water network optimization, to name but a few. The emphasis of our work is on improving the performance of spatial branch-and-bound algorithms, which provide a general-purpose approach to finding global optima and bring together solving techniques such as relaxations, primal heuristics, presolving procedures, and others. A particular focus is on creating new relaxation schemes and domain reduction techniques, which provide dual bounds that serve as optimality proofs.

Our research is closely linked with the development of the nonlinear functionality of the constraint integer programming solver SCIP. SCIP implements a spatial branch-and-cut algorithm that can handle general MINLP problems. Its plugin-based structure allows easy addition of new expression types and new solving techniques. SCIP is one of the leading MINLP solvers.
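As a small, concrete example of the relaxation and domain reduction devices used in spatial branch and bound (an illustrative textbook sketch, not SCIP code): the McCormick envelope sandwiches a bilinear product z = x·y between four linear inequalities derived from the variable bounds, and branching on a variable's domain tightens the envelope:

```python
def envelope_at(x, y, xl, xu, yl, yu):
    """Tightest McCormick under-/over-estimate of the product x*y
    over the box [xl, xu] x [yl, yu], evaluated at the point (x, y)."""
    under = max(yl * x + xl * y - xl * yl,   # z >= yl*x + xl*y - xl*yl
                yu * x + xu * y - xu * yu)   # z >= yu*x + xu*y - xu*yu
    over = min(yl * x + xu * y - xu * yl,    # z <= yl*x + xu*y - xu*yl
               yu * x + xl * y - xl * yu)    # z <= yu*x + xl*y - xl*yu
    return under, over

# On the unit box, the envelope around x*y at (0.5, 0.5) has gap 0.5.
lo, hi = envelope_at(0.5, 0.5, 0.0, 1.0, 0.0, 1.0)

# After branching x's domain down to [0, 0.5] -- a domain reduction --
# the envelope at the same point becomes exact (gap 0).
lo2, hi2 = envelope_at(0.5, 0.5, 0.0, 0.5, 0.0, 1.0)
```

This is why branching helps nonconvex problems: splitting a variable's domain shrinks the gap between the relaxation and the true function, driving the dual bound toward the global optimum.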

### Associated projects

- Convex Solver Adaptivity for Mixed-integer Optimization (MATH+ AA3-15)
- Quantum Computing and Integer Programming

### Selected publications

- Bestuzheva, K., Besançon, M., Chen, W.-K., Chmiela, A., Donkiewicz, T., van Doornmalen, J., Eifler, L., Gaul, O., Gamrath, G., Gleixner, A., Gottwald, L., Graczyk, C., Halbig, K., Hoen, A., Hojny, C., van der Hulst, R., Koch, T., Lübbecke, M., Maher, S. J., … Witzig, J. (2023). Enabling Research Through the SCIP Optimization Suite 8.0.
*ACM Transactions on Mathematical Software*. DOI: 10.1145/3585516 [arXiv] [BibTeX]

- Bestuzheva, K., Chmiela, A., Müller, B., Serrano, F., Vigerske, S., and Wegscheider, F. (2023).
*Global Optimization of Mixed-integer Nonlinear Programs with SCIP 8.0*. [URL] [arXiv] [BibTeX]

- Bestuzheva, K., Gleixner, A., and Achterberg, T. (2023). Efficient Separation of RLT Cuts for Implicit and Explicit Bilinear Products.
*Proceedings of Conference on Integer Programming and Combinatorial Optimization*, 14–28. DOI: 10.1007/978-3-031-32726-1_2 [arXiv] [BibTeX]

- Chmiela, A., Muñoz, G., and Serrano, F. (2023). Monoidal Strengthening and Unique Lifting in MIQCPs.
*Proceedings of Conference on Integer Programming and Combinatorial Optimization*. [URL] [BibTeX]

- Bestuzheva, K., Gleixner, A., and Völker, H. (2022). Strengthening SONC Relaxations with Constraints Derived From Variable Bounds.
*Proceedings of the Hungarian Global Optimization Workshop HUGO 2022*, 41–44. [URL] [BibTeX]

- Chmiela, A., Muñoz, G., and Serrano, F. (2022). On the Implementation and Strengthening of Intersection Cuts for QCQPs.
*Mathematical Programming B*, *197*, 549–586. DOI: 10.1007/s10107-022-01808-5 [BibTeX]

- Ramin, E., Bestuzheva, K., Gargalo, C., Ramin, D., Schneider, C., Ramin, P., Flores-Alsina, X., Andersen, M., and Gernaey, K. (2021). Incremental Design of Water Symbiosis Networks with Prior Knowledge: the Case of an Industrial Park in Kenya.
*Science of the Total Environment*, *751*. DOI: 10.1016/j.scitotenv.2020.141706 [BibTeX]

## Robust and Explainable Learning

Machine learning has revolutionised the process of analyzing data and has led to
new insights and applications. However, one of the key shortcomings of the field
is its reliance on black-box models, which may lack explainability.

A key aspect of our research is the development of interpretable machine learning algorithms,
using techniques such as adversarial optimization and sparsity.
We also focus on the interplay between learning and optimization:
we develop new algorithms to train machine learning models and
exploit machine learning models to improve the optimization process itself.
Finally, we aim to identify the theoretical reasons behind the success of such models.
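As a concrete illustration of how sparsity yields interpretability (a textbook sketch, not code from our projects): proximal gradient descent on the lasso drives most coefficients to exactly zero via soft-thresholding, leaving a small, readable set of active features:

```python
import numpy as np

def ista_lasso(A, b, lam, steps=500):
    """Proximal gradient (ISTA) for  min 0.5*||Ax - b||^2 + lam*||x||_1.
    The prox of the l1 term is soft-thresholding, which sets small
    coefficients exactly to zero -- the source of sparsity."""
    L = np.linalg.norm(A, 2) ** 2        # Lipschitz constant of the smooth part
    x = np.zeros(A.shape[1])
    for _ in range(steps):
        z = x - A.T @ (A @ x - b) / L    # gradient step on the smooth part
        x = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)  # prox step
    return x

# Recover a 2-sparse signal from noiseless measurements: the fitted
# model ends up using only the truly relevant features.
rng = np.random.default_rng(0)
A = rng.standard_normal((50, 10))
x_true = np.zeros(10)
x_true[1], x_true[7] = 3.0, -2.0
x_hat = ista_lasso(A, A @ x_true, lam=0.1)
```

Because the recovered model has only two nonzero coefficients, a human can read off exactly which inputs drive the prediction, in contrast to a dense black-box fit.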

### Associated projects

- Adaptive Algorithms Through Machine Learning (MATH+ EF1-9)
- Expanding Merlin-Arthur Classifiers (MATH+ EF1-67)
- Learning to Schedule Heuristics in IP
- Globally Optimal Neural Network Training (SPP 2298, project number 463910157)
- AI-Based High-Resolution Forest Monitoring

### Selected publications

- Zimmer, M., Spiegel, C., and Pokutta, S. (2023). How I Learned to Stop Worrying and Love Retraining.
*Proceedings of International Conference on Learning Representations*. [arXiv] [code] [BibTeX]

- Kossen, T., Hirzel, M. A., Madai, V. I., Boenisch, F., Hennemuth, A., Hildebrand, K., Pokutta, S., Sharma, K., Hilbert, A., Sobesky, J., Galinovic, I., Khalil, A. A., Fiebach, J. B., and Frey, D. (2022). Towards Sharing Brain Images: Differentially Private TOF-MRA Images with Segmentation Labels Using Generative Adversarial Networks.
*Frontiers in Artificial Intelligence*. DOI: 10.3389/frai.2022.813842 [BibTeX]

- Macdonald, J., Besançon, M., and Pokutta, S. (2022). Interpretable Neural Networks with Frank-Wolfe: Sparse Relevance Maps and Relevance Orderings.
*Proceedings of International Conference on Machine Learning*. [arXiv] [poster] [video] [BibTeX]

- Nohadani, O., and Sharma, K. (2022). Optimization Under Connected Uncertainty.
*INFORMS Journal on Optimization*. DOI: 10.1287/ijoo.2021.0067 [BibTeX]

- Wäldchen, S., Huber, F., and Pokutta, S. (2022). Training Characteristic Functions with Reinforcement Learning: XAI-methods Play Connect Four.
*Proceedings of International Conference on Machine Learning*. [arXiv] [poster] [video] [BibTeX]

- Wäldchen, S., Sharma, K., Zimmer, M., Turan, B., and Pokutta, S. (2022).
*Merlin-Arthur Classifiers: Formal Interpretability with Interactive Black Boxes*. [arXiv] [BibTeX]

- Combettes, C., Spiegel, C., and Pokutta, S. (2020).
*Projection-free Adaptive Gradients for Large-scale Optimization*. [arXiv] [summary] [code] [BibTeX]

- Pokutta, S., Spiegel, C., and Zimmer, M. (2020).
*Deep Neural Network Training with Frank-Wolfe*. [arXiv] [summary] [code] [BibTeX]

## Computational Mathematics

Computers and artificial intelligence have always been important tools for mathematicians: they allow one to gather data and extrapolate connections to formulate conjectures, exhaustively execute case analyses too large to be done by hand, solve underlying optimization problems, and formalize and verify proofs. We are interested in exploring the many ways in which computational tools can be used to advance mathematics and establish novel results, leveraging our unique knowledge of and access to ZIB’s computational resources.
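A toy example of exhaustive case analysis of the kind a computer makes routine (illustrative only, not code from our projects): verifying the Ramsey number fact R(3,3) = 6 by checking every two-coloring of the edges of K5 and K6 for a monochromatic triangle:

```python
from itertools import combinations, product

def has_mono_triangle(n, color):
    """color maps each edge (i, j) with i < j of K_n to 0 or 1."""
    return any(color[(a, b)] == color[(a, c)] == color[(b, c)]
               for a, b, c in combinations(range(n), 3))

def every_coloring_has_mono_triangle(n):
    """True iff every 2-coloring of K_n contains a monochromatic triangle."""
    edges = list(combinations(range(n), 2))
    # 2^|E| colorings: 1024 cases for K_5, 32768 for K_6.
    return all(has_mono_triangle(n, dict(zip(edges, colors)))
               for colors in product((0, 1), repeat=len(edges)))
```

Here `every_coloring_has_mono_triangle(5)` is `False` (the pentagon coloring avoids monochromatic triangles, so R(3,3) > 5) while `every_coloring_has_mono_triangle(6)` is `True`, so R(3,3) = 6. Real computer-assisted proofs in extremal combinatorics scale this idea up with symmetry reduction, SAT solvers, and flag algebras.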

### Associated projects

- Learning Extremal Structures in Combinatorics (MATH+ EF1-12)
- Scaling Up Flag Algebras in Combinatorics (MATH+ EF1-21)

### Selected publications

- Parczyk, O., Pokutta, S., Spiegel, C., and Szabó, T. (2023). Fully Computer-assisted Proofs in Extremal Combinatorics.
*Proceedings of AAAI Conference on Artificial Intelligence*. [arXiv] [slides] [code] [BibTeX]

- Rué Perna, J. J., and Spiegel, C. (2023). The Rado Multiplicity Problem in Vector Spaces Over Finite Fields.
*Proceedings of European Conference on Combinatorics*. [arXiv] [code] [BibTeX]

- Braun, G., Pokutta, S., and Weismantel, R. (2022).
*Alternating Linear Minimization: Revisiting von Neumann’s Alternating Projections*. [arXiv] [slides] [video] [BibTeX]

- Kamčev, N., and Spiegel, C. (2022). Another Note on Intervals in the Hales-Jewett Theorem.
*Electronic Journal of Combinatorics*, *29*(1). DOI: 10.37236/9400 [URL] [arXiv] [BibTeX]

## Quantum Mathematics

We are interested in exploring the multifaceted aspects of quantum computation and quantum information theory. Our goal is the development and application of new algorithms and quantum-inspired methods for solving problems in fields such as dynamical systems, quantum nonlocality, and supervised learning. Our research encompasses a wide range of topics and utilizes a variety of mathematical tools, including but not limited to: tensor decompositions and tensor-based algorithms, convex optimization methods, operator theory (in particular transfer operators), numerical methods for partial differential equations, and data-driven and kernel-based techniques.

Through our work, we aim to advance knowledge and understanding in the field of quantum science and technology by developing mathematically profound methods which exploit the potential of quantum mechanical concepts.
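A small illustration of the tensor viewpoint on quantum circuits (a textbook sketch, not code from our projects): a two-qubit circuit is a sequence of Kronecker-structured linear maps, and the Schmidt rank of the resulting state, read off from an SVD of its reshaped coefficient tensor, quantifies entanglement:

```python
import numpy as np

H = np.array([[1.0, 1.0], [1.0, -1.0]]) / np.sqrt(2.0)   # Hadamard gate
CNOT = np.array([[1.0, 0.0, 0.0, 0.0],
                 [0.0, 1.0, 0.0, 0.0],
                 [0.0, 0.0, 0.0, 1.0],
                 [0.0, 0.0, 1.0, 0.0]])

# Bell-state circuit: H on qubit 0, then CNOT with qubit 0 as control.
psi = np.zeros(4)
psi[0] = 1.0                                    # start in |00>
psi = CNOT @ (np.kron(H, np.eye(2)) @ psi)      # psi = (|00> + |11>)/sqrt(2)

# Schmidt decomposition: reshape the state vector into its 2x2
# coefficient matrix; the singular values are the Schmidt coefficients,
# and their count (the rank) measures entanglement -- here rank 2,
# i.e. a maximally entangled state.
schmidt = np.linalg.svd(psi.reshape(2, 2), compute_uv=False)
```

Tensor-train and other low-rank decompositions generalize exactly this reshaping idea to many qubits, where the full state vector is exponentially large but its Schmidt ranks across cuts may stay small.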

### Associated projects

- Quantum Computing and Integer Programming
- Quantum Algorithms for Optimization
- Quantum-accelerated Optimization for Industrial Applications

### Selected publications

- Designolle, S., Iommazzo, G., Besançon, M., Knebel, S., Gelß, P., and Pokutta, S. (2023).
*Improved Local Models and New Bell Inequalities Via Frank-Wolfe Algorithms*. [arXiv] [BibTeX]

- Gelß, P., Klein, R., Matera, S., and Schmidt, B. (2023).
*Quantum Dynamics of Coupled Excitons and Phonons in Chain-like Systems: Tensor Train Approaches and Higher-order Propagators*. [arXiv] [BibTeX]

- Riedel, J., Gelß, P., Klein, R., and Schmidt, B. (2023). WaveTrain: A Python Package for Numerical Quantum Mechanics of Chain-like Systems Based on Tensor Trains.
*The Journal of Chemical Physics*, *158*(16), 164801. DOI: 10.1063/5.0147314 [URL] [arXiv] [BibTeX]

- Stengl, S.-M. (2023).
*An Alternative Formulation of the Quantum Phase Estimation Using Projection-based Tensor Decompositions*. [arXiv] [BibTeX]

- Gelß, P., Klus, S., Shakibaei, Z., and Pokutta, S. (2022).
*Low-rank Tensor Decompositions of Quantum Circuits*. [arXiv] [BibTeX]