10:00am - 12:00pm
Tropical geometry in machine learning
Chair(s): Gregory Naisat (The University of Chicago, United States of America)
A connection between tropical polynomials and neural networks has recently been established, but it remains to be explored in full. Currently, only the most basic notions from tropical geometry are used, to quantify the number of linear regions of a neural network. The purpose of this session is to present what is currently known about the relationship between tropical polynomials and neural networks, and to promote further exploration of tropical algebra in the context of machine learning and neural networks.
(25 minutes for each presentation, including questions, followed by a 5-minute break; in case of x<4 talks, the first x slots are used unless indicated otherwise)
Tropical geometry of deep neural networks
Gregory Naisat
The University of Chicago, United States of America
Abstract: We explore connections between feedforward neural networks with ReLU activation and tropical geometry. We show that the family of such neural networks is equivalent to the family of tropical rational maps. Among other things, we deduce that feedforward ReLU neural networks with one hidden layer can be characterized by zonotopes, which serve as building blocks for deeper networks; we relate decision boundaries of such neural networks to tropical hypersurfaces, a major object of study in tropical geometry; and we prove that linear regions of such neural networks correspond to vertices of polytopes associated with tropical rational functions. An insight from our tropical formulation is that a deeper network is exponentially more expressive than a shallow network.
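As an illustrative aside (not part of the abstract), the per-neuron identity behind this equivalence can be checked numerically: a ReLU unit max(w·x + b, 0) equals the difference of two max-of-affine (tropical polynomial) terms obtained by splitting w into its positive and negative parts. The sizes and data below are arbitrary.

```python
# Minimal numerical sketch (assumptions mine): a ReLU unit written as a
# difference of two tropical (max-of-affine) terms,
#   max(w.x + b, 0) = max(w_plus.x + b, w_minus.x) - w_minus.x,
# where w_plus = max(w, 0) and w_minus = max(-w, 0) elementwise.
import numpy as np

rng = np.random.default_rng(0)
w = rng.standard_normal(5)          # arbitrary weights for one hidden unit
b = rng.standard_normal()
w_plus, w_minus = np.maximum(w, 0.0), np.maximum(-w, 0.0)

for _ in range(1000):
    x = rng.standard_normal(5)
    relu = max(w @ x + b, 0.0)                    # the ReLU unit
    p = max(w_plus @ x + b, w_minus @ x)          # tropical polynomial term
    q = w_minus @ x                               # tropical monomial term
    assert np.isclose(relu, p - q)                # ReLU = tropical rational p - q
```

Composing such differences layer by layer is what yields the tropical rational maps referred to in the abstract.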
Tropical geometry and weighted lattices
Petros Maragos
National Technical University of Athens
We present advances in extending the max-plus or min-plus algebraic structure of tropical geometry by using weighted lattices and a max-* algebra with an arbitrary binary operation * that distributes over max or min. The envisioned application areas include geometry, image analysis, optimization and learning. Further, we generalize some tropical geometrical objects using weighted lattices. For example, we outline the optimal solution of max-* equations using weighted lattice adjunctions, and apply it to optimal regression for fitting max-* tropical curves on arbitrary data.
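As a brief sketch restricted to the ordinary max-plus case (the talk treats general max-* weighted lattices), the adjunction-based principal solution of A ⊗ x ≤ b, with (A ⊗ x)_i = max_j (A[i,j] + x[j]), is x̂[j] = min_i (b[i] - A[i,j]). The data and function names below are arbitrary.

```python
# Minimal max-plus sketch (assumptions mine): the greatest x with A (x) x <= b
# is given by the adjunction (residual) x_hat[j] = min_i (b[i] - A[i,j]);
# when A (x) x = b is solvable, this x_hat is its greatest solution.
import numpy as np

def maxplus_product(A, x):
    # (A (x) x)_i = max_j (A[i,j] + x[j])
    return np.max(A + x[None, :], axis=1)

def principal_solution(A, b):
    # greatest sub-solution via the max-plus adjunction
    return np.min(b[:, None] - A, axis=0)

rng = np.random.default_rng(1)
A = rng.uniform(-1, 1, size=(4, 3))
b = rng.uniform(-1, 1, size=4)

x_hat = principal_solution(A, b)
assert np.all(maxplus_product(A, x_hat) <= b + 1e-12)   # greatest sub-solution
```

The regression application mentioned in the abstract fits max-* curves by using this residual as the optimal (largest) model lying below the data.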
A Tropical Approach to Neural Networks with Piecewise Linear Activations
Vasileios Charisopoulos
Cornell University
This talk revisits the problem of counting the regions of linearity of piecewise linear neural networks. We treat layers of neural networks with piecewise linear activations as tropical signomials, which generalize polynomials in the so-called (max, +) or "tropical" algebra to the case of real-valued exponents. Motivated by the discussion in (Montufar et al., 2014), this approach enables us to recover tight bounds on linear regions of layers with ReLU / leaky ReLU activations, as well as bounds for layers with arbitrary convex, piecewise linear activations.
Our approach crucially relies on exploiting a correspondence between regions of linearity and vertices of Newton polytopes, which also enables us to design a randomized method for counting linear regions in practice. This algorithm relies on sampling vertices and places no restrictions on the range of inputs of the neural network, avoiding the overhead of existing exact approaches which rely on solving a large number of linear or mixed-integer programs. Moreover, it extends beyond rectifier networks.
The results presented in the talk are joint work with Petros Maragos.
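As a toy illustration of the sampling idea for a single tropical polynomial (a simplification, not the algorithm presented in the talk): the linear regions of p(x) = max_i (a_i·x + b_i) correspond to the terms that attain the maximum somewhere, i.e. to upper-hull vertices of the lifted Newton polytope, so sampling unrestricted inputs and recording the maximizing term yields a lower-bound estimate of the region count without solving linear or mixed-integer programs.

```python
# Toy sketch (assumptions mine): estimate how many terms of a tropical
# polynomial p(x) = max_i (a_i . x + b_i) are ever the maximizer, i.e. how
# many linear regions are attained, by sampling inputs with no range restriction.
import numpy as np

rng = np.random.default_rng(2)
a = rng.standard_normal((20, 3))        # 20 affine terms in 3 input variables
b = rng.standard_normal(20)

attained = set()
for _ in range(5000):
    x = rng.standard_normal(3) * 10.0   # unrestricted sample of the input space
    attained.add(int(np.argmax(a @ x + b)))

print(f"estimated number of linear regions: {len(attained)} (of at most 20 terms)")
```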