Nico Pelleriti

📬 Contact

office
Room 3024 at ZIB
e-mail
homepage
pelleriti.org

🎓 Curriculum vitae

since 2025
Researcher at TUB
since 2024
Member of BMS
since 2023
Research Assistant at ZIB
Aug 2024
B.Sc. in Mathematics at TUB
Sep 2025
M.Sc. in Mathematics at TUB

📝 Publications and preprints

Preprints

  1. Pelleriti, N., Spiegel, C., Liu, S., Martínez-Rubio, D., Zimmer, M., and Pokutta, S. (2025). Neural Sum-of-Squares: Certifying the Nonnegativity of Polynomials with Transformers. [arXiv]
    [BibTeX]
    @misc{2025_PelleritiEtAl_Neuralsos_2510-13444,
      archiveprefix = {arXiv},
      eprint = {2510.13444},
      arxiv = {arXiv:2510.13444},
      primaryclass = {cs.LG},
      year = {2025},
      author = {Pelleriti, Nico and Spiegel, Christoph and Liu, Shiwei and Martínez-Rubio, David and Zimmer, Max and Pokutta, Sebastian},
      title = {Neural Sum-of-Squares: Certifying the Nonnegativity of Polynomials with Transformers},
      date = {2025-10-15}
    }

Conference proceedings

  1. Kera, H., Pelleriti, N., Ishihara, Y., Zimmer, M., and Pokutta, S. (2025). Computational Algebra with Attention: Transformer Oracles for Border Basis Algorithms. Proceedings of the Conference on Neural Information Processing Systems, 38. DOI: 10.48550/arXiv.2505.23696 [arXiv]
    [BibTeX]
    @inproceedings{2025_KeraEtAl_Transformeroracles_2505-23696,
      year = {2025},
      booktitle = {Proceedings of the Conference on Neural Information Processing Systems},
      month = sep,
      volume = {38},
      doi = {10.48550/arXiv.2505.23696},
      archiveprefix = {arXiv},
      eprint = {2505.23696},
      arxiv = {arXiv:2505.23696},
      primaryclass = {cs.LG},
      author = {Kera, Hiroshi and Pelleriti, Nico and Ishihara, Yuki and Zimmer, Max and Pokutta, Sebastian},
      title = {Computational Algebra with Attention: Transformer Oracles for Border Basis Algorithms},
      date = {2025-05-29}
    }
  2. Pelleriti, N., Zimmer, M., Wirth, E., and Pokutta, S. (2025). Approximating Latent Manifolds in Neural Networks Via Vanishing Ideals. Proceedings of the International Conference on Machine Learning, 267. DOI: 10.48550/arXiv.2502.15051 [arXiv]
    [BibTeX]
    @inproceedings{2025_PelleritiZimmerWirthPokutta_Latentmanifolds,
      year = {2025},
      booktitle = {Proceedings of the International Conference on Machine Learning},
      month = may,
      volume = {267},
      doi = {10.48550/arXiv.2502.15051},
      archiveprefix = {arXiv},
      eprint = {2502.15051},
      arxiv = {arXiv:2502.15051},
      primaryclass = {cs.LG},
      author = {Pelleriti, Nico and Zimmer, Max and Wirth, Elias and Pokutta, Sebastian},
      title = {Approximating Latent Manifolds in Neural Networks Via Vanishing Ideals},
      date = {2025-02-20}
    }

🔬 Projects

Agent AI in Mathematics

This project aims to develop agentic AI systems for mathematical research: autonomous discovery of patterns and conjectures, design and execution of computational experiments, and integration with formal verification tools.

MATH+ EF-LiOpt-3
Sep 2025 to Aug 2028

💬 Talks and posters

Poster presentations

Dec 2025
Computational Algebra with Attention: Transformer Oracles for Border Basis Algorithms
39th Conference on Neural Information Processing Systems (NeurIPS), San Diego
Jul 2025
Approximating Latent Manifolds in Neural Networks Via Vanishing Ideals
42nd International Conference on Machine Learning (ICML), Vancouver

📅 Event attendance