Max Zimmer

My research interests focus on Deep Learning, Efficient ML, and Optimization. Specifically, I work on enhancing the efficiency of large Neural Networks through sparsity, pruning, quantization, and low-rank optimization. I am also interested in Federated Learning, Explainability & Fairness, and in applying ML to problems in pure mathematics.

📬 Contact

office: Room 3024 at ZIB
e-mail:
homepage: maxzimmer.org
languages: German, English, and Italian

🎓 Curriculum vitae

since 2021: Researcher at ZIB
summer 2020 to 2021: Research Assistant at ZIB
M.Sc. at TUB
Aug 2017: B.Sc. in Mathematics at TUB

📝 Publications and preprints

Preprints

  1. Mundinger, K., Zimmer, M., and Pokutta, S. (2024). Neural Parameter Regression for Explicit Representations of PDE Solution Operators. [arXiv]
    [BibTeX]
    @misc{NeuralRegressionPDE2024,
      archiveprefix = {arXiv},
      eprint = {2403.12764},
      primaryclass = {cs.LG},
      year = {2024},
      author = {Mundinger, Konrad and Zimmer, Max and Pokutta, Sebastian},
      title = {Neural Parameter Regression for Explicit Representations of PDE Solution Operators}
    }
  2. Roux, C., Zimmer, M., and Pokutta, S. (2024). On the Byzantine-resilience of Distillation-based Federated Learning. [arXiv]
    [BibTeX]
    @misc{byzantinefederated2024,
      archiveprefix = {arXiv},
      eprint = {2402.12265},
      primaryclass = {cs.LG},
      year = {2024},
      author = {Roux, Christophe and Zimmer, Max and Pokutta, Sebastian},
      title = {On the Byzantine-resilience of Distillation-based Federated Learning}
    }
  3. Zimmer, M., Andoni, M., Spiegel, C., and Pokutta, S. (2023). PERP: Rethinking the Prune-Retrain Paradigm in the Era of LLMs. [arXiv] [code]
    [BibTeX]
    @misc{zasp_perp_23,
      archiveprefix = {arXiv},
      eprint = {2312.15230},
      primaryclass = {cs.LG},
      year = {2023},
      author = {Zimmer, Max and Andoni, Megi and Spiegel, Christoph and Pokutta, Sebastian},
      title = {PERP: Rethinking the Prune-Retrain Paradigm in the Era of LLMs},
      code = {https://github.com/ZIB-IOL/PERP}
    }
  4. Zimmer, M., Spiegel, C., and Pokutta, S. (2022). Compression-aware Training of Neural Networks Using Frank-Wolfe. [arXiv]
    [BibTeX]
    @misc{zsp_deepsparsefw_22,
      archiveprefix = {arXiv},
      eprint = {2205.11921},
      primaryclass = {cs.LG},
      year = {2022},
      author = {Zimmer, Max and Spiegel, Christoph and Pokutta, Sebastian},
      title = {Compression-aware Training of Neural Networks Using Frank-Wolfe}
    }
  5. Pokutta, S., Spiegel, C., and Zimmer, M. (2020). Deep Neural Network Training with Frank-Wolfe. [arXiv] [summary] [code]
    [BibTeX]
    @misc{zsp_deepfw_20,
      archiveprefix = {arXiv},
      eprint = {2010.07243},
      primaryclass = {cs.LG},
      year = {2020},
      author = {Pokutta, Sebastian and Spiegel, Christoph and Zimmer, Max},
      title = {Deep Neural Network Training with Frank-Wolfe},
      code = {https://github.com/ZIB-IOL/StochasticFrankWolfe},
      summary = {https://pokutta.com/blog/research/2020/11/11/NNFW.html}
    }

Conference proceedings

  1. Pauls, J., Zimmer, M., Kelly, U. M., Schwartz, M., Saatchi, S., Ciais, P., Pokutta, S., Brandt, M., and Gieseke, F. (2024). Estimating Canopy Height at Scale. Proceedings of International Conference on Machine Learning. [arXiv] [code]
    [BibTeX]
    @inproceedings{canopy2024,
      year = {2024},
      booktitle = {Proceedings of International Conference on Machine Learning},
      archiveprefix = {arXiv},
      eprint = {2406.01076},
      primaryclass = {cs.CV},
      author = {Pauls, Jan and Zimmer, Max and Kelly, Una M and Schwartz, Martin and Saatchi, Sassan and Ciais, Philippe and Pokutta, Sebastian and Brandt, Martin and Gieseke, Fabian},
      title = {Estimating Canopy Height at Scale},
      code = {https://github.com/AI4Forest/Global-Canopy-Height-Map}
    }
  2. Mundinger, K., Pokutta, S., Spiegel, C., and Zimmer, M. (2024). Extending the Continuum of Six-Colorings. Proceedings of Discrete Mathematics Days. [URL] [arXiv]
    [BibTeX]
    @inproceedings{mpsz_hadwigernelsonspectrum_24:1,
      year = {2024},
      booktitle = {Proceedings of Discrete Mathematics Days},
      url = {https://dmd2024.web.uah.es/files/abstracts/paper_27.pdf},
      archiveprefix = {arXiv},
      eprint = {2404.05509},
      author = {Mundinger, Konrad and Pokutta, Sebastian and Spiegel, Christoph and Zimmer, Max},
      title = {Extending the Continuum of Six-Colorings}
    }
  3. Wäldchen, S., Sharma, K., Turan, B., Zimmer, M., and Pokutta, S. (2024). Interpretability Guarantees with Merlin-Arthur Classifiers. Proceedings of International Conference on Artificial Intelligence and Statistics. [arXiv]
    [BibTeX]
    @inproceedings{wszp_merlinarthur_22,
      year = {2024},
      booktitle = {Proceedings of International Conference on Artificial Intelligence and Statistics},
      archiveprefix = {arXiv},
      eprint = {2206.00759},
      primaryclass = {cs.LG},
      author = {Wäldchen, Stephan and Sharma, Kartikey and Turan, Berkant and Zimmer, Max and Pokutta, Sebastian},
      title = {Interpretability Guarantees with Merlin-Arthur Classifiers}
    }
  4. Zimmer, M., Spiegel, C., and Pokutta, S. (2024). Sparse Model Soups: A Recipe for Improved Pruning Via Model Averaging. Proceedings of International Conference on Learning Representations. [URL] [arXiv]
    [BibTeX]
    @inproceedings{zsp_modelsoup_23,
      year = {2024},
      booktitle = {Proceedings of International Conference on Learning Representations},
      url = {https://iclr.cc/virtual/2024/poster/17433},
      archiveprefix = {arXiv},
      eprint = {2306.16788},
      primaryclass = {cs.LG},
      author = {Zimmer, Max and Spiegel, Christoph and Pokutta, Sebastian},
      title = {Sparse Model Soups: A Recipe for Improved Pruning Via Model Averaging}
    }
  5. Zimmer, M., Spiegel, C., and Pokutta, S. (2023). How I Learned to Stop Worrying and Love Retraining. Proceedings of International Conference on Learning Representations. [URL] [arXiv] [code]
    [BibTeX]
    @inproceedings{zsp_retrain_21,
      year = {2023},
      booktitle = {Proceedings of International Conference on Learning Representations},
      url = {https://iclr.cc/virtual/2023/poster/10914},
      archiveprefix = {arXiv},
      eprint = {2111.00843},
      primaryclass = {cs.LG},
      author = {Zimmer, Max and Spiegel, Christoph and Pokutta, Sebastian},
      title = {How I Learned to Stop Worrying and Love Retraining},
      code = {https://github.com/ZIB-IOL/BIMP}
    }
  6. Ziemke, T., Sering, L., Vargas Koch, L., Zimmer, M., Nagel, K., and Skutella, M. (2020). Flows Over Time As Continuous Limits of Packet-based Network Simulations. Proceedings of EURO Working Group on Transportation Meeting.
    [BibTeX]
    @inproceedings{zskzns_flows_21:1,
      year = {2020},
      booktitle = {Proceedings of EURO Working Group on Transportation Meeting},
      author = {Ziemke, Theresa and Sering, Leon and Vargas Koch, Laura and Zimmer, Max and Nagel, Kai and Skutella, Martin},
      title = {Flows Over Time As Continuous Limits of Packet-based Network Simulations}
    }

Full articles

  1. Mundinger, K., Pokutta, S., Spiegel, C., and Zimmer, M. (2024). Extending the Continuum of Six-Colorings. Geombinatorics Quarterly, XXXIV. [URL] [arXiv]
    [BibTeX]
    @article{mpsz_hadwigernelsonspectrum_24,
      year = {2024},
      journal = {Geombinatorics Quarterly},
      volume = {XXXIV},
      url = {https://geombina.uccs.edu/past-issues/volume-xxxiv},
      archiveprefix = {arXiv},
      eprint = {2404.05509},
      author = {Mundinger, Konrad and Pokutta, Sebastian and Spiegel, Christoph and Zimmer, Max},
      title = {Extending the Continuum of Six-Colorings}
    }
  2. Ziemke, T., Sering, L., Vargas Koch, L., Zimmer, M., Nagel, K., and Skutella, M. (2021). Flows Over Time As Continuous Limits of Packet-based Network Simulations. Transportation Research Procedia, 52, 123–130. DOI: 10.1016/j.trpro.2021.01.014 [URL]
    [BibTeX]
    @article{zskzns_flows_21,
      year = {2021},
      journal = {Transportation Research Procedia},
      volume = {52},
      pages = {123-130},
      doi = {10.1016/j.trpro.2021.01.014},
      url = {https://sciencedirect.com/science/article/pii/S2352146521000284},
      author = {Ziemke, Theresa and Sering, Leon and Vargas Koch, Laura and Zimmer, Max and Nagel, Kai and Skutella, Martin},
      title = {Flows Over Time As Continuous Limits of Packet-based Network Simulations}
    }

🔬 Projects

AI-Based High-Resolution Forest Monitoring

Preserving forests is crucial for climate adaptation and mitigation, which requires accurate, up-to-date information on forest health. AI4Forest aims to develop advanced AI methods to monitor forests using satellite, radar, and LiDAR data. The project will create scalable techniques for detailed, high-resolution forest maps that are updated weekly, both for Europe and globally.

AI4Forest
Jun 2023 to May 2027

Research Campus MODAL SynLab

SynLab researches the mathematical generalization of application-specific advances achieved in the Gas-, Rail-, and MedLabs of the research campus MODAL. The focus is on exact methods for solving a broad class of discrete-continuous optimization problems. This requires advanced techniques for structure recognition, the handling of nonlinear constraints arising in practice, and the efficient implementation of mathematical algorithms on modern computer architectures. The results are bundled into a professional software package and complemented by a range of high-performance methods for specific, highly innovative applications.

SynLab
Apr 2020 to Mar 2025

💬 Talks and posters

Poster presentations

May 2023: How I Learned to Stop Worrying and Love Retraining, 11th ICLR Conference, Kigali