Joint IOL and COGA research seminar

List of talks for the joint weekly seminar with the Combinatorial Optimization & Graph Algorithms (COGA) group at TUB in Fall 2020. See the official homepage for more information and contact Christoph Hertrich for Zoom links and other questions; please use the contact address below for questions about this page.

We meet every Thursday at 14:00 in room MA 517 at TUB when the pandemic situation allows, otherwise online.

Until further notice the seminar takes place online.

Photo Rico Raber

Rico Raber (COGA)
Coordinates:

Title. Multidimensional Packing under Convex Quadratic Constraints (own results)

Abstract.

We consider a general class of binary packing problems with multiple convex quadratic knapsack constraints. We present three constant-factor approximation algorithms based on three different algorithmic techniques: (1) a rounding technique tailored to a convex relaxation in conjunction with a non-convex relaxation; (2) a greedy strategy; and (3) a randomized rounding method. We test the practical performance in a computational study of the empirical approximation ratios of these algorithms on problem instances arising in the context of real-world gas transport networks.
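
For intuition, here is a minimal sketch of the greedy idea on a single convex quadratic knapsack constraint; the item ordering by a diagonal proxy and the random data are illustrative assumptions, not the constant-factor algorithms analysed in the talk.

    import numpy as np

    def greedy_quadratic_knapsack(p, W, c):
        """Toy greedy heuristic: maximize p @ x s.t. x @ W @ x <= c, x binary.

        Items are tried in decreasing order of the proxy p_i / sqrt(W_ii);
        an item is kept only if the quadratic constraint stays satisfied.
        """
        n = len(p)
        order = np.argsort(-p / np.sqrt(np.diag(W)))
        x = np.zeros(n)
        for i in order:
            x[i] = 1.0
            if x @ W @ x > c:      # would violate the convex quadratic constraint
                x[i] = 0.0
        return x

    # Small random instance (hypothetical data, for illustration only).
    rng = np.random.default_rng(0)
    n = 8
    p = rng.uniform(1.0, 5.0, size=n)
    B = rng.uniform(0.0, 1.0, size=(n, n))
    W = B @ B.T + np.eye(n)        # positive definite, hence a convex constraint
    x = greedy_quadratic_knapsack(p, W, c=10.0)
    print("selected items:", np.flatnonzero(x), "profit:", round(p @ x, 2))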

Photo Elias Wirth

Elias Wirth (IOL)
Coordinates:

Title. Learning Relations From Data With Conditional Gradients (own results)

Abstract.

The learning of relations, dynamics, and governing equations from (sampled) data is a prevalent problem in various scientific fields. In the absence of noise, this task can often be solved using the Buchberger-Möller Algorithm, see Möller and Buchberger (1982). In general, however, we do not have direct access to the aforementioned equations and can observe them only indirectly via noisy samples. Known methods that extract relations from noisy data are the Approximate Vanishing Ideal Algorithm (AVI) and Vanishing Component Analysis (VCA), see Heldt et al. (2009) and Livni et al. (2013), respectively. These algorithms are often computationally inefficient and allow only restricted recovery of relations from data. We present a new AVI algorithm and two variants utilizing Conditional Gradient Descent, also known as the Frank-Wolfe Algorithm, see, e.g., Frank and Wolfe (1956) or Levitin and Polyak (1966). Our method is more general, easily regularizable, and can also extract non-homogeneous relations. We then demonstrate the efficacy and versatility of our approach on three learning problems: reconstruction of governing equations from data, classification, and regression.
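
As a simplified illustration of what a vanishing relation is (using a plain SVD on a monomial evaluation matrix in the spirit of the classical AVI/VCA methods, not the conditional-gradient-based variants presented in the talk; the circle data are made up):

    import numpy as np

    # Noisy samples from the unit circle; the relation x^2 + y^2 - 1 = 0
    # should be recovered as an (approximately) vanishing polynomial.
    rng = np.random.default_rng(0)
    theta = rng.uniform(0.0, 2.0 * np.pi, size=200)
    X = np.c_[np.cos(theta), np.sin(theta)] + 0.01 * rng.normal(size=(200, 2))

    # Evaluate all monomials of degree <= 2 on the sample points.
    x, y = X[:, 0], X[:, 1]
    monomials = ["1", "x", "y", "x^2", "x*y", "y^2"]
    A = np.c_[np.ones_like(x), x, y, x**2, x * y, y**2]

    # A coefficient vector in the approximate null space of A defines a
    # polynomial that nearly vanishes on all sample points.
    _, s, Vt = np.linalg.svd(A, full_matrices=False)
    coeffs = Vt[-1]                 # direction of the smallest singular value
    print("smallest singular value:", round(s[-1], 4))
    print({m: round(float(c), 3) for m, c in zip(monomials, coeffs)})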

Photo Stephan Wäldchen

Stephan Wäldchen
Coordinates:

Title. Understanding Neural Network Decisions is hard - From Probabilistic Prime Implicants to Arc Bending (own results)

Abstract.

This talk gives a basic overview of my PhD projects. I focus on a project on making neural networks interpretable. If a neural network outputs a classification result for a high-dimensional input, we would like to know whether a small set of the input parameters was already decisive for the classification. This problem is formalised by introducing a probabilistic variant of prime implicants. We show that this problem is hard for the class NP^PP and remains NP-hard for any non-trivial approximation. We present a heuristic solution strategy for real-world applications that is built on Assumed Density Filtering, a technique to propagate multivariate normal distributions through neural networks. To justify the use of this heuristic, we show that in fact no sensible family of distributions can be invariant under ReLU neural network layers.
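
To make the Assumed Density Filtering step concrete, here is a minimal sketch for a single ReLU unit in the univariate case (the closed-form rectified-Gaussian moments only; the multivariate propagation from the talk is not reproduced, and the parameters below are arbitrary):

    import numpy as np
    from scipy.stats import norm

    def relu_moment_match(mu, sigma):
        """ADF step for one ReLU unit: X ~ N(mu, sigma^2) goes in, a Gaussian
        matching the first two moments of ReLU(X) comes out."""
        alpha = mu / sigma
        mean = mu * norm.cdf(alpha) + sigma * norm.pdf(alpha)
        second = (mu**2 + sigma**2) * norm.cdf(alpha) + mu * sigma * norm.pdf(alpha)
        var = max(second - mean**2, 1e-12)
        return mean, np.sqrt(var)

    # Sanity check against Monte Carlo (arbitrary parameters).
    mu, sigma = -0.5, 1.5
    m, s = relu_moment_match(mu, sigma)
    samples = np.maximum(np.random.default_rng(0).normal(mu, sigma, 10**6), 0.0)
    print("ADF:", (round(float(m), 3), round(float(s), 3)))
    print("Monte Carlo:", (round(samples.mean(), 3), round(samples.std(), 3)))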

Photo Thomas Kerdreux

Thomas Kerdreux (IOL)
Coordinates:

Title. The artification of the so-called A.I. art and the creative industry (discussion)

Abstract.

In Computer Vision, algorithms are now capable of “intelligent” automatic processing of images. These possibilities have recently been recycled in many creative processes by A.I. artists, whose outputs have sometimes reached hammer prices in art auctions. This emergence of A.I. art looks like an artification, i.e., a societal process by which a practice becomes commonly acknowledged as an art form. Art history abounds with such examples, e.g., the transition from craftsmanship to solo painters in the Renaissance. Hence, rather than assessing the artist’s narratives, this point of view allows us to understand A.I. art as a figurehead of the broader ecosystem of the creative industry, which has itself benefited greatly from recent advances in Computer Vision. We discuss here the possible intertwinements between A.I. art, the use of Computer Vision in the next generation of creative software, the creative industry, and broader societal consequences. This is ongoing work with Louis Thiry and Léa Saint-Raymond at École Normale Supérieure in Paris.

Photo Khai Van Tran

Khai Van Tran (TU Berlin)
Coordinates:

Title. Improved Bounds on the Competitive Ratio for Symmetric Rendezvous-on-the-Line with Unknown Initial Distance (own results)

Abstract.

During the COGA Retreat 2019, we started research on the Symmetric Rendezvous-on-the-Line Problem with Unknown Initial Distance: Two robots are placed on the real line with unknown initial distance. They do not know the location of the other robot and do not necessarily share the same sense of “left” and “right” on the line. They move with constant speed over the line and try to achieve rendezvous using the same (randomized) search strategy in as little time as possible. In a March 2020 talk (given in the COGA Research Seminar) we presented a family of search strategies that performed well in simulations, but we were not able to determine their exact competitive ratio. In this talk we present a novel family of search strategies whose precise competitive ratio can be computed using generating functions. This is work in progress by Guillaume, Khai Van, Martin, and Max.
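
To make the setting concrete, here is a small Monte Carlo sketch of one naive symmetric strategy (random direction, geometrically growing excursions from the start point). It is not the family of strategies from the talk, and the printed number is only a rough empirical proxy, not the exact competitive ratio obtained via generating functions.

    import numpy as np

    rng = np.random.default_rng(7)

    def meeting_time(d, dt=0.01, max_time=2_000.0):
        """Two unit-speed robots start distance d apart. In phase k, each
        independently walks 2^k in a random direction from its own start
        point and returns. Returns the (approximate) first crossing time."""
        x, y = 0.0, d          # current positions
        hx, hy = 0.0, d        # start points the robots return to
        t, k = 0.0, 0
        while t < max_time:
            tx = hx + rng.choice([-1.0, 1.0]) * 2.0**k
            ty = hy + rng.choice([-1.0, 1.0]) * 2.0**k
            for gx, gy in ((tx, ty), (hx, hy)):      # walk out, then walk home
                while abs(gx - x) > 1e-9 or abs(gy - y) > 1e-9:
                    x += max(-dt, min(dt, gx - x))
                    y += max(-dt, min(dt, gy - y))
                    t += dt
                    if abs(x - y) <= 2 * dt:         # robots meet (or cross)
                        return t
            k += 1
        return float("inf")

    d = 1.0
    times = [meeting_time(d) for _ in range(100)]
    finite = [t for t in times if t < float("inf")]
    print(f"{len(finite)}/100 runs met within the horizon;",
          "mean meeting time / d:", round(sum(finite) / len(finite) / d, 2))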

Photo Christoph Hertrich

Christoph Hertrich (TU Berlin) [homepage]
Coordinates:

Title. Computing the Maximum Function with ReLU Neural Networks (own results)

Abstract.

This talk deals with an open problem concerning the expressivity of feedforward neural networks (NNs) with rectified linear unit (ReLU) activations. While it has been shown that every continuous, piecewise linear (CPWL) function can be exactly represented by such an NN with logarithmic depth (with respect to the input dimension), it is still unknown whether there exists any CPWL function that cannot be represented with three layers. Based on a yet unproven theoretical assumption, we give a computer-aided proof that no 3-layer ReLU NN can compute the maximum of five scalar input numbers. We achieve this partial result by converting the existence question into a linear programming feasibility problem. This is joint work in progress with Amitabh Basu, Marco Di Summa, and Martin Skutella.
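
For context, the logarithmic-depth upper bound mentioned above rests on a standard construction: the maximum of two numbers needs only one hidden ReLU layer, and nesting this gives the maximum of five numbers. A minimal sketch of that baseline (not the 3-layer question investigated in the talk):

    import numpy as np

    def relu(z):
        return np.maximum(z, 0.0)

    def max2(a, b):
        """max(a, b) with one hidden ReLU layer, using
        max(a, b) = (a + b + |a - b|) / 2 and |z| = relu(z) + relu(-z)."""
        return 0.5 * (relu(a + b) - relu(-a - b) + relu(a - b) + relu(b - a))

    def max5(v):
        # Nesting max2 yields the maximum of five numbers with depth
        # logarithmic in the number of inputs -- the baseline construction.
        return max2(max2(max2(v[0], v[1]), max2(v[2], v[3])), v[4])

    vals = np.array([3.0, -1.0, 7.5, 2.0, 7.4])
    print(max5(vals), "==", np.max(vals))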

Photo Cyrille Combettes

Cyrille Combettes (Georgia Tech & IOL) [homepage]
Coordinates:

Title. Frank–Wolfe with New and Practical Descent Directions (own results)

Abstract.

The Frank-Wolfe algorithm has become a popular constrained optimization algorithm because it is simple and projection-free, and it has been successfully applied to a variety of real-world problems. Its main drawback, however, lies in its convergence rate, which can be excessively slow due to naive descent directions. In this talk, we provide a brief review and propose two methods that significantly improve its performance: one for minimizing a convex objective and one for minimizing a (non)convex finite-sum objective. The first method speeds up the Frank-Wolfe algorithm by better aligning the descent direction with that of the negative gradient via a matching pursuit-style subroutine. Although the idea is reasonably natural, its advantage over state-of-the-art Frank-Wolfe algorithms is quite considerable. The second method blends stochastic Frank-Wolfe algorithms with adaptive gradients, which set entry-wise step sizes that automatically adjust to the geometry of the problem. Computational experiments on convex and nonconvex objectives demonstrate the advantage of our approach.
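
As a reference point for the two methods above, here is a minimal vanilla Frank-Wolfe sketch over the probability simplex with the classical 2/(t+2) step size; the test problem is an arbitrary illustrative choice, not one of the accelerated variants from the talk.

    import numpy as np

    def frank_wolfe_simplex(grad, x0, iters=500):
        """Vanilla Frank-Wolfe over the probability simplex.

        Each iteration calls the linear minimization oracle (pick the vertex
        e_i with the most negative gradient entry) and steps towards it; the
        direction s - x is exactly the naive descent direction that the
        talk's first method tries to align better with the negative gradient.
        """
        x = x0.copy()
        for t in range(iters):
            g = grad(x)
            s = np.zeros_like(x)
            s[np.argmin(g)] = 1.0          # LMO over the simplex
            gamma = 2.0 / (t + 2.0)
            x += gamma * (s - x)
        return x

    # Illustrative problem: Euclidean projection of a point p onto the simplex,
    # i.e. minimize 0.5 * ||x - p||^2; the gradient is x - p.
    p = np.array([0.4, -0.2, 1.1, 0.3])
    x = frank_wolfe_simplex(lambda x: x - p, np.full(4, 0.25))
    print("approximate projection:", np.round(x, 3))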