Max Zimmer
My research interests focus on Deep Learning, Efficient ML, Optimization, and Sustainable AI. Specifically, I work on enhancing the efficiency of large Neural Networks through sparsity, pruning, quantization, and low-rank optimization. I am also interested in Federated Learning, Interpretability & Fairness, and in applying ML to problems from pure mathematics. Recently, I have started using Deep Learning methods on satellite imagery to create high-resolution surface maps of the globe, making it possible to track climate-relevant metrics such as above-ground biomass.
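As a small illustration of the pruning theme, here is a minimal sketch of one-shot global magnitude pruning in PyTorch. It is a generic textbook baseline with a hypothetical helper name and sparsity level, not an implementation of any method from the publications below.

```python
import torch

def global_magnitude_prune(model: torch.nn.Module, sparsity: float) -> None:
    """Zero out the smallest-magnitude weights across all layers (one-shot)."""
    # Consider only weight matrices/tensors; skip biases for simplicity.
    weights = [p for p in model.parameters() if p.dim() > 1]
    scores = torch.cat([w.detach().abs().flatten() for w in weights])
    # The k-th smallest magnitude acts as a single global threshold.
    k = max(1, int(sparsity * scores.numel()))
    threshold = torch.kthvalue(scores, k).values
    with torch.no_grad():
        for w in weights:
            w.mul_((w.abs() > threshold).float())  # zero out small weights

# Example: prune 90% of the weights of a toy model.
model = torch.nn.Sequential(
    torch.nn.Linear(784, 256), torch.nn.ReLU(), torch.nn.Linear(256, 10)
)
global_magnitude_prune(model, sparsity=0.9)
```

In practice, pruning is followed by a retraining phase to recover accuracy, which is exactly the regime studied in the retraining papers listed below.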
📬 Contact
- office
- Room 3037 at ZIB
- zimmer (at) zib.de
- homepage
- maxzimmer.org
- languages
- German, English, and Italian
🎓 Curriculum vitae
- since 2022
- Member of BMS
- since 2021
- Researcher at ZIB
- Summer 2020 to 2021
- Research Assistant at ZIB
- Jun 2021
- M.Sc. in Mathematics at TUB
- Aug 2017
- B.Sc. in Mathematics at TUB
📝 Publications and preprints
Preprints
- Mundinger, K., Zimmer, M., and Pokutta, S. (2024). Neural Parameter Regression for Explicit Representations of PDE Solution Operators.
[arXiv]
[BibTeX]
- Roux, C., Zimmer, M., and Pokutta, S. (2024). On the Byzantine-resilience of Distillation-based Federated Learning.
[arXiv]
[BibTeX]
- Zimmer, M., Andoni, M., Spiegel, C., and Pokutta, S. (2023). PERP: Rethinking the Prune-Retrain Paradigm in the Era of LLMs.
[arXiv]
[code]
[BibTeX]
- Zimmer, M., Spiegel, C., and Pokutta, S. (2022). Compression-aware Training of Neural Networks Using Frank-Wolfe.
[arXiv]
[BibTeX]
- Pokutta, S., Spiegel, C., and Zimmer, M. (2020). Deep Neural Network Training with Frank-Wolfe.
[arXiv]
[summary]
[code]
[BibTeX]
Conference proceedings
- Wäldchen, S., Sharma, K., Turan, B., Zimmer, M., and Pokutta, S. (2024). Interpretability Guarantees with Merlin-Arthur Classifiers. Proceedings of the International Conference on Artificial Intelligence and Statistics.
[arXiv]
[BibTeX]
- Zimmer, M., Spiegel, C., and Pokutta, S. (2024). Sparse Model Soups: A Recipe for Improved Pruning Via Model Averaging. Proceedings of the International Conference on Learning Representations.
[URL]
[arXiv]
[BibTeX]
- Mundinger, K., Pokutta, S., Spiegel, C., and Zimmer, M. (2024). Extending the Continuum of Six-Colorings. Proceedings of the Discrete Mathematics Days.
[URL]
[arXiv]
[BibTeX]
- Pauls, J., Zimmer, M., Kelly, U. M., Schwartz, M., Saatchi, S., Ciais, P., Pokutta, S., Brandt, M., and Gieseke, F. (2024). Estimating Canopy Height at Scale. Proceedings of the International Conference on Machine Learning.
[arXiv]
[code]
[BibTeX]
- Zimmer, M., Spiegel, C., and Pokutta, S. (2023). How I Learned to Stop Worrying and Love Retraining. Proceedings of the International Conference on Learning Representations.
[URL]
[arXiv]
[code]
[BibTeX]
- Ziemke, T., Sering, L., Vargas Koch, L., Zimmer, M., Nagel, K., and Skutella, M. (2020). Flows Over Time As Continuous Limits of Packet-based Network Simulations. Proceedings of the EURO Working Group on Transportation Meeting.
[BibTeX]
Full articles
- Mundinger, K., Pokutta, S., Spiegel, C., and Zimmer, M. (2024). Extending the Continuum of Six-Colorings. Geombinatorics Quarterly, XXXIV.
[URL]
[arXiv]
[BibTeX]
- Ziemke, T., Sering, L., Vargas Koch, L., Zimmer, M., Nagel, K., and Skutella, M. (2021). Flows Over Time As Continuous Limits of Packet-based Network Simulations. Transportation Research Procedia, 52, 123–130.
DOI: 10.1016/j.trpro.2021.01.014
[URL]
[BibTeX]
🔬 Projects
Preserving global vegetation is crucial for addressing and mitigating climate change, and accurate, up-to-date data on forest health is essential for doing so. AI4Forest aims to develop advanced AI methods to monitor forests using satellite imagery, including radar and optical data. The project will create scalable techniques for producing detailed, high-resolution maps of the globe, e.g., to monitor canopy height and biomass and to track forest disturbances.
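As a toy illustration of this approach, the sketch below trains a tiny fully convolutional network for per-pixel canopy-height regression in PyTorch; the band count, architecture, loss, and tensors are hypothetical placeholders, not the project's actual pipeline (see the Estimating Canopy Height at Scale paper for that).

```python
import torch
import torch.nn as nn

class CanopyHeightNet(nn.Module):
    """Toy fully convolutional net: multispectral patches -> per-pixel height."""
    def __init__(self, in_channels: int = 12):  # e.g., 12 optical bands (assumption)
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(in_channels, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 1, kernel_size=1),  # one height value per pixel
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.net(x).squeeze(1)  # (B, H, W) map of heights

# One training step against reference canopy heights (random stand-in tensors).
model = CanopyHeightNet()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
images = torch.randn(4, 12, 64, 64)   # batch of satellite patches
targets = torch.rand(4, 64, 64) * 40  # reference heights in metres
optimizer.zero_grad()
loss = nn.functional.huber_loss(model(images), targets)
loss.backward()
optimizer.step()
```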
SynLab investigates the mathematical generalization of application-specific advances achieved in the GasLab, RailLab, and MedLab of the Research Campus MODAL. The focus is on exact methods for solving a broad class of discrete-continuous optimization problems. This requires advanced techniques for structure recognition, the handling of nonlinear constraints arising in practice, and the efficient implementation of mathematical algorithms on modern computer architectures. The results are bundled in a professional software package and complemented by a range of high-performance methods for specific, highly innovative applications.
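For a flavor of the problem class, here is a minimal sketch of a discrete-continuous (mixed-integer) model written with PySCIPOpt, the Python interface to the SCIP solver developed at ZIB; the toy model itself is an invented example.

```python
from pyscipopt import Model  # Python interface to the SCIP solver

# Toy mixed-integer program with one integer and one continuous variable:
#   maximize 3x + 2y   subject to   x + y <= 4,   x integer >= 0,   y >= 0
model = Model("toy_micp")
x = model.addVar(vtype="I", name="x")  # integer (discrete) decision
y = model.addVar(vtype="C", name="y")  # continuous decision
model.addCons(x + y <= 4)
model.setObjective(3 * x + 2 * y, sense="maximize")
model.optimize()
print("x =", model.getVal(x), " y =", model.getVal(y))
```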
💬 Talks and posters
Research seminar talks
- Oct 2024
- Sparse Model Soups: A Recipe for Improved Pruning Via Model Averaging
Research Seminar of the Machine Learning and Data Engineering Group at WWU, Münster
- Nov 2023
- Approaches to Neural Network Compression - Or: How I Learned To Stop Worrying and Love Retraining
AI4Forest Kick-off Meeting, Paris
- Jun 2022
- Sparsity in Neural Networks
Siemens Workshop at ZIB, Berlin
Poster presentations
- Jul 2024
- Estimating Canopy Height at Scale
41st International Conference on Machine Learning (ICML), Vienna
- May 2024
- Sparse Model Soups: A Recipe for Improved Pruning Via Model Averaging
12th International Conference on Learning Representations (ICLR), Vienna
- May 2023
- How I Learned to Stop Worrying and Love Retraining
11th International Conference on Learning Representations (ICLR), Kigali
- Mar 2023
- How I Learned to Stop Worrying and Love Retraining
Workshop on Optimization and Machine Learning, Waischenfeld
📅 Event Attendance
- May 2025
- 7th DOxML Conference, Kyoto
- Oct 2024
- Responsible AI Summit, Paris
- Jul 2024
- 41st International Conference on Machine Learning (ICML), Vienna
- May 2024
- 12th International Conference on Learning Representations (ICLR), Vienna
- Mar 2024
- ELLIS Winter School on Foundation Models
- Dec 2023
- NeurIPS@Paris
- May 2023
- 11th International Conference on Learning Representations (ICLR), Kigali