Max Zimmer

My research interests focus on Deep Learning, Efficient ML, Optimization, and Sustainable AI. Specifically, I work on enhancing the efficiency of large Neural Networks through sparsity, pruning, quantization, and low-rank optimization. I am also interested in Federated Learning, Interpretability & Fairness, and applying ML to problems in pure mathematics. Recently, I started using Deep Learning methods on satellite imagery to create high-resolution surface maps of the globe, enabling the tracking of climate-relevant metrics such as above-ground biomass.

📬 Contact

office
Room 3037 at ZIB
e-mail
homepage
maxzimmer.org
languages
German, English, and Italian

🎓 Curriculum vitae

since 2022
Member of BMS
since 2021
Researcher at ZIB
Summer 2020 to 2021
Research Assistant at ZIB
Jun 2021
M.Sc. in Mathematics at TUB
Aug 2017
B.Sc. in Mathematics at TUB

📝 Publications and preprints

Preprints

  1. Mundinger, K., Zimmer, M., and Pokutta, S. (2024). Neural Parameter Regression for Explicit Representations of PDE Solution Operators. [arXiv]
    [BibTeX]
    @misc{2024_MundingerZimmerPokutta_Neuralparameterregression,
      archiveprefix = {arXiv},
      eprint = {2403.12764},
      primaryclass = {cs.LG},
      year = {2024},
      author = {Mundinger, Konrad and Zimmer, Max and Pokutta, Sebastian},
      title = {Neural Parameter Regression for Explicit Representations of PDE Solution Operators}
    }
  2. Roux, C., Zimmer, M., and Pokutta, S. (2024). On the Byzantine-resilience of Distillation-based Federated Learning. [arXiv]
    [BibTeX]
    @misc{2024_RouxZimmerPokutta_Byzantineresilience,
      archiveprefix = {arXiv},
      eprint = {2402.12265},
      primaryclass = {cs.LG},
      year = {2024},
      author = {Roux, Christophe and Zimmer, Max and Pokutta, Sebastian},
      title = {On the Byzantine-resilience of Distillation-based Federated Learning}
    }
  3. Zimmer, M., Andoni, M., Spiegel, C., and Pokutta, S. (2023). PERP: Rethinking the Prune-Retrain Paradigm in the Era of LLMs. [arXiv] [code]
    [BibTeX]
    @misc{2023_ZimmerAndoniSpiegelPokutta_PerpPruneRetrain,
      archiveprefix = {arXiv},
      eprint = {2312.15230},
      primaryclass = {cs.CL},
      year = {2023},
      author = {Zimmer, Max and Andoni, Megi and Spiegel, Christoph and Pokutta, Sebastian},
      title = {PERP: Rethinking the Prune-Retrain Paradigm in the Era of LLMs},
      code = {https://github.com/ZIB-IOL/PERP}
    }
  4. Zimmer, M., Spiegel, C., and Pokutta, S. (2022). Compression-aware Training of Neural Networks Using Frank-Wolfe. [arXiv]
    [BibTeX]
    @misc{2022_ZimmerSpiegelPokutta_Compressionawaretraining,
      archiveprefix = {arXiv},
      eprint = {2205.11921},
      primaryclass = {cs.LG},
      year = {2022},
      author = {Zimmer, Max and Spiegel, Christoph and Pokutta, Sebastian},
      title = {Compression-aware Training of Neural Networks Using Frank-Wolfe}
    }
  5. Pokutta, S., Spiegel, C., and Zimmer, M. (2020). Deep Neural Network Training with Frank-Wolfe. [arXiv] [summary] [code]
    [BibTeX]
    @misc{2020_PokuttaSpiegelZimmer_Frankwolfeneuralnetworks,
      archiveprefix = {arXiv},
      eprint = {2010.07243},
      primaryclass = {cs.LG},
      year = {2020},
      author = {Pokutta, Sebastian and Spiegel, Christoph and Zimmer, Max},
      title = {Deep Neural Network Training with Frank-Wolfe},
      code = {https://github.com/ZIB-IOL/StochasticFrankWolfe},
      summary = {https://pokutta.com/blog/research/2020/11/11/NNFW.html}
    }

Conference proceedings

  1. Wäldchen, S., Sharma, K., Turan, B., Zimmer, M., and Pokutta, S. (2024). Interpretability Guarantees with Merlin-Arthur Classifiers. Proceedings of the International Conference on Artificial Intelligence and Statistics. [arXiv]
    [BibTeX]
    @inproceedings{2022_WaeldchenEtAl_Interpretabilityguarantees,
      year = {2024},
      booktitle = {Proceedings of the International Conference on Artificial Intelligence and Statistics},
      archiveprefix = {arXiv},
      eprint = {2206.00759},
      primaryclass = {cs.LG},
      author = {Wäldchen, Stephan and Sharma, Kartikey and Turan, Berkant and Zimmer, Max and Pokutta, Sebastian},
      title = {Interpretability Guarantees with Merlin-Arthur Classifiers}
    }
  2. Zimmer, M., Spiegel, C., and Pokutta, S. (2024). Sparse Model Soups: A Recipe for Improved Pruning Via Model Averaging. Proceedings of the International Conference on Learning Representations. [URL] [arXiv]
    [BibTeX]
    @inproceedings{2023_ZimmerSpiegelPokutta_Sparsemodelsoups,
      year = {2024},
      booktitle = {Proceedings of the International Conference on Learning Representations},
      url = {https://iclr.cc/virtual/2024/poster/17433},
      archiveprefix = {arXiv},
      eprint = {2306.16788},
      primaryclass = {cs.LG},
      author = {Zimmer, Max and Spiegel, Christoph and Pokutta, Sebastian},
      title = {Sparse Model Soups: A Recipe for Improved Pruning Via Model Averaging}
    }
  3. Mundinger, K., Pokutta, S., Spiegel, C., and Zimmer, M. (2024). Extending the Continuum of Six-Colorings. Proceedings of the Discrete Mathematics Days. [URL] [arXiv]
    [BibTeX]
    @inproceedings{2024_MundingerPokuttaSpiegelZimmer_SixcoloringsExpansion:1,
      year = {2024},
      booktitle = {Proceedings of the Discrete Mathematics Days},
      url = {https://dmd2024.web.uah.es/files/abstracts/paper_27.pdf},
      archiveprefix = {arXiv},
      eprint = {2404.05509},
      primaryclass = {math.CO},
      author = {Mundinger, Konrad and Pokutta, Sebastian and Spiegel, Christoph and Zimmer, Max},
      title = {Extending the Continuum of Six-Colorings}
    }
  4. Pauls, J., Zimmer, M., Kelly, U. M., Schwartz, M., Saatchi, S., Ciais, P., Pokutta, S., Brandt, M., and Gieseke, F. (2024). Estimating Canopy Height at Scale. Proceedings of the International Conference on Machine Learning. [arXiv] [code]
    [BibTeX]
    @inproceedings{2024_PaulsEtAl_Canopyheightestimation,
      year = {2024},
      booktitle = {Proceedings of the International Conference on Machine Learning},
      archiveprefix = {arXiv},
      eprint = {2406.01076},
      primaryclass = {cs.CV},
      author = {Pauls, Jan and Zimmer, Max and Kelly, Una M and Schwartz, Martin and Saatchi, Sassan and Ciais, Philippe and Pokutta, Sebastian and Brandt, Martin and Gieseke, Fabian},
      title = {Estimating Canopy Height at Scale},
      code = {https://github.com/AI4Forest/Global-Canopy-Height-Map}
    }
  5. Zimmer, M., Spiegel, C., and Pokutta, S. (2023). How I Learned to Stop Worrying and Love Retraining. Proceedings of the International Conference on Learning Representations. [URL] [arXiv] [code]
    [BibTeX]
    @inproceedings{2021_ZimmerSpiegelPokutta_Retrainingpruning,
      year = {2023},
      booktitle = {Proceedings of the International Conference on Learning Representations},
      url = {https://iclr.cc/virtual/2023/poster/10914},
      archiveprefix = {arXiv},
      eprint = {2111.00843},
      primaryclass = {cs.LG},
      author = {Zimmer, Max and Spiegel, Christoph and Pokutta, Sebastian},
      title = {How I Learned to Stop Worrying and Love Retraining},
      code = {https://github.com/ZIB-IOL/BIMP}
    }
  6. Ziemke, T., Sering, L., Vargas Koch, L., Zimmer, M., Nagel, K., and Skutella, M. (2020). Flows Over Time As Continuous Limits of Packet-based Network Simulations. Proceedings of the EURO Working Group on Transportation Meeting.
    [BibTeX]
    @inproceedings{2021_ZiemkeEtAl_Flowscontinuouslimits:1,
      year = {2020},
      booktitle = {Proceedings of the EURO Working Group on Transportation Meeting},
      author = {Ziemke, Theresa and Sering, Leon and Vargas Koch, Laura and Zimmer, Max and Nagel, Kai and Skutella, Martin},
      title = {Flows Over Time As Continuous Limits of Packet-based Network Simulations}
    }

Full articles

  1. Mundinger, K., Pokutta, S., Spiegel, C., and Zimmer, M. (2024). Extending the Continuum of Six-Colorings. Geombinatorics Quarterly, XXXIV. [URL] [arXiv]
    [BibTeX]
    @article{2024_MundingerPokuttaSpiegelZimmer_SixcoloringsExpansion,
      year = {2024},
      journal = {Geombinatorics Quarterly},
      volume = {XXXIV},
      url = {https://geombina.uccs.edu/past-issues/volume-xxxiv},
      archiveprefix = {arXiv},
      eprint = {2404.05509},
      primaryclass = {math.CO},
      author = {Mundinger, Konrad and Pokutta, Sebastian and Spiegel, Christoph and Zimmer, Max},
      title = {Extending the Continuum of Six-Colorings}
    }
  2. Ziemke, T., Sering, L., Vargas Koch, L., Zimmer, M., Nagel, K., and Skutella, M. (2021). Flows Over Time As Continuous Limits of Packet-based Network Simulations. Transportation Research Procedia, 52, 123–130. DOI: 10.1016/j.trpro.2021.01.014 [URL]
    [BibTeX]
    @article{2021_ZiemkeEtAl_Flowscontinuouslimits,
      year = {2021},
      journal = {Transportation Research Procedia},
      volume = {52},
pages = {123--130},
      doi = {10.1016/j.trpro.2021.01.014},
      url = {https://sciencedirect.com/science/article/pii/S2352146521000284},
      author = {Ziemke, Theresa and Sering, Leon and Vargas Koch, Laura and Zimmer, Max and Nagel, Kai and Skutella, Martin},
      title = {Flows Over Time As Continuous Limits of Packet-based Network Simulations}
    }

🔬 Projects

AI-Based High-Resolution Forest Monitoring

Preserving global vegetation is crucial for addressing and mitigating climate change, and accurate, up-to-date data on forest health is essential to that effort. AI4Forest aims to develop advanced AI methods to monitor forests using satellite imagery, including radar and optical data. The project will create scalable techniques for detailed, high-resolution maps of the globe, e.g., to monitor canopy height and biomass and to track forest disturbances.

AI4Forest
Jun 2023 to May 2027

Research Campus MODAL SynLab

SynLab researches the mathematical generalization of application-specific advances achieved in the Gas-, Rail-, and MedLab of the research campus MODAL. The focus is on exact methods for solving a broad class of discrete-continuous optimization problems. This requires advanced techniques for structure recognition, the treatment of nonlinear constraints arising in practice, and the efficient implementation of mathematical algorithms on modern computer architectures. The results are bundled into a professional software package and complemented by a range of high-performance methods for specific, highly innovative applications.

SynLab
Apr 2020 to Mar 2025

💬 Talks and posters

Research seminar talks

Oct 2024
Sparse Model Soups: A Recipe for Improved Pruning Via Model Averaging
Research Seminar of the Machine Learning and Data Engineering Group at WWU, Münster
Nov 2023
Approaches to Neural Network Compression – Or: How I Learned to Stop Worrying and Love Retraining
AI4Forest Kick-off Meeting, Paris
Jun 2022
Sparsity in Neural Networks
Siemens Workshop at ZIB, Berlin

Poster presentations

Jul 2024
Estimating Canopy Height at Scale
41st International Conference on Machine Learning (ICML), Vienna
May 2024
Sparse Model Soups: A Recipe for Improved Pruning Via Model Averaging
12th International Conference on Learning Representations (ICLR), Vienna
May 2023
How I Learned to Stop Worrying and Love Retraining
11th International Conference on Learning Representations (ICLR), Kigali
Mar 2023
How I Learned to Stop Worrying and Love Retraining
Workshop on Optimization and Machine Learning, Waischenfeld

📅 Event Attendance