Globally Optimal Neural Network Training (completed)
Training artificial neural networks is a key optimization task in deep learning. To improve generalization, robustness, and explainability, we aim to compute globally optimal solutions to the training problem. To this end, we model training as a mixed-integer nonlinear program (MINLP) and enhance solution techniques such as spatial branch-and-cut. In addition, we exploit symmetry both to reduce the computational burden and to guarantee symmetric solutions, and we incorporate true sparsity directly within the mixed-integer nonlinear programming framework.
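As a rough, hypothetical illustration of this idea (not the project's actual model or code), even a tiny one-hidden-layer ReLU network can be trained via an MINLP: binary variables with big-M constraints encode the ReLU activations exactly, and the bilinear output layer makes the problem nonconvex, so a global solver based on spatial branch-and-bound is needed. The data, variable bounds, big-M value, and solver choice below are all illustrative assumptions.

```python
# Hypothetical sketch: globally optimal training of a tiny ReLU network as an
# MINLP, modeled in Pyomo. Not the project's code; all numbers are toy values.
import pyomo.environ as pyo

# Toy data: 4 samples, 1 input feature, 2 hidden ReLU units.
X = [0.0, 1.0, 2.0, 3.0]
Y = [0.0, 1.0, 0.5, 2.0]
H = range(2)          # hidden units
N = range(len(X))     # samples
M_BIG = 10.0          # big-M constant (valid here since |w*x + b| <= 4)

m = pyo.ConcreteModel()
m.w = pyo.Var(H, bounds=(-1, 1))          # input-to-hidden weights
m.b = pyo.Var(H, bounds=(-1, 1))          # hidden biases
m.v = pyo.Var(H, bounds=(-1, 1))          # hidden-to-output weights
m.h = pyo.Var(N, H, bounds=(0, M_BIG))    # ReLU activations
m.z = pyo.Var(N, H, within=pyo.Binary)    # ReLU on/off indicators

# Exact big-M encoding of h[i,j] = max(0, w[j]*X[i] + b[j]):
# z = 1 forces h = w*x + b >= 0; z = 0 forces h = 0 and w*x + b <= 0.
m.relu_lb = pyo.Constraint(N, H, rule=lambda m, i, j:
    m.h[i, j] >= m.w[j] * X[i] + m.b[j])
m.relu_on = pyo.Constraint(N, H, rule=lambda m, i, j:
    m.h[i, j] <= m.w[j] * X[i] + m.b[j] + M_BIG * (1 - m.z[i, j]))
m.relu_off = pyo.Constraint(N, H, rule=lambda m, i, j:
    m.h[i, j] <= M_BIG * m.z[i, j])

# Squared training loss; the products v[j]*h[i,j] are bilinear, so this is a
# nonconvex MINLP rather than a mixed-integer linear or convex program.
m.loss = pyo.Objective(expr=sum(
    (Y[i] - sum(m.v[j] * m.h[i, j] for j in H)) ** 2 for i in N))

# A global MINLP solver (e.g. SCIP or Couenne) certifies global optimality.
pyo.SolverFactory("scip").solve(m, tee=True)
```

Such formulations come with a certificate of global optimality but scale poorly with network and data size, which is precisely why the project focuses on stronger solving techniques such as spatial branch-and-cut and on exploiting symmetry.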
🧑‍🎓 Project Members
🪙 Funding
This project was funded by the German Research Foundation (DFG) within the Priority Programme SPP 2298/1 "Theoretical Foundations of Deep Learning" (project number 441826958) from March 2021 to February 2022.