The colloquium of the Centre for Mathematical Sciences, Lund University, normally runs once a month. It is aimed at the entire Centre for Mathematical Sciences, with overview talks by renowned experts on exciting mathematical topics.
Talks are usually scheduled for Wednesdays, 15:15–16:15, in the Hörmander lecture hall, located on the ground floor of the Mathematics building. The room is equipped with six large blackboards and two projectors.
The purpose of our colloquium is twofold: first, to provide an inspiring overview of a specific field of mathematics; second, to bring together students and staff from across the department and to serve as the proverbial watering hole where contacts are made and maintained.
The colloquium is organized by Yacin Ameur, Dragi Anevski, Magnus Goffeng, Tony Stillfjord and Karl Åström. Feel free to contact any one of us with questions or suggestions for colloquium speakers.
Colloquia, Spring 2023
Klas Modin (Chalmers and University of Gothenburg)
Title: A Brief History of Geometric Mechanics
In this talk I wish to give an overview of the tight connection between differential geometry and classical mechanics. I shall take a historical route, going from Newton and Euler, via Lagrange and Hamilton, Poincaré and the Prize of King Oscar II, to Arnold and the Riemannian description of hydrodynamics.
Alain Valette (University of Neuchâtel)
Title: The Wasserstein distance on metric trees
Let (X,d) be a metric space. The Wasserstein distance, or earthmover distance, is a metric on the space of probability measures on X that originates from optimal transportation: if μ and ν are probability measures on X, the Wasserstein distance between μ and ν intuitively represents the minimal amount of work necessary to transform μ into ν. Since the definition of the Wasserstein distance involves an infimum, one does not expect a closed formula for this distance in general. Such a closed formula, however, exists for metric trees (i.e. combinatorial trees where the length of an edge can be any positive real number), and this closed formula has an interesting history that makes it suitable for a colloquium talk: it appeared first in computer science papers (Charikar 2002), then surfaced again in bio-mathematics (Evans-Matsen 2012), before catching the interest of pure mathematicians. In joint work with M. Mathey-Prévôt, we advocate that the right framework for this closed formula is real trees (i.e. geodesic metric spaces with the property that any two points are connected by a unique arc); we give two proofs of the closed formula, one algorithmic, the other connecting with Lipschitz-free spaces from Banach space theory.
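As a concrete illustration (not part of the abstract), here is a minimal sketch of the closed formula in the combinatorial rooted-tree case: the distance is the sum over edges of the edge length times the absolute difference of the two subtree masses. The data layout and function name below are hypothetical choices for this sketch.

```python
def wasserstein_tree(parent, length, mu, nu):
    """Closed formula for the Wasserstein (earthmover) distance on a
    rooted metric tree:
        W1(mu, nu) = sum over edges e of length(e) * |mu(T_e) - nu(T_e)|,
    where T_e is the subtree hanging below edge e.

    parent[v] = parent of vertex v (the root has parent None),
    length[v] = length of the edge joining v to its parent,
    mu, nu    = probability masses placed on each vertex.
    """
    children = {v: [] for v in parent}
    for v, p in parent.items():
        if p is not None:
            children[p].append(v)

    def subtree_mass(m, v):
        # Total mass of m in the subtree rooted at v.
        return m[v] + sum(subtree_mass(m, c) for c in children[v])

    total = 0.0
    for v, p in parent.items():
        if p is None:
            continue  # the root has no edge above it
        total += length[v] * abs(subtree_mass(mu, v) - subtree_mass(nu, v))
    return total

# Path 0 - 1 - 2 with unit edge lengths: moving a point mass from vertex 0
# to vertex 2 costs the path length 2.
parent = {0: None, 1: 0, 2: 1}
length = {1: 1.0, 2: 1.0}
mu = {0: 1.0, 1: 0.0, 2: 0.0}
nu = {0: 0.0, 1: 0.0, 2: 1.0}
print(wasserstein_tree(parent, length, mu, nu))  # → 2.0
```

On a path the formula reduces to the familiar one-dimensional transport cost, which is why the toy example above gives exactly the path length between the two point masses.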
Jeff Steif (Chalmers and University of Gothenburg)
Title: Boolean Functions, Critical Percolation, and Noise Sensitivity
I will introduce and discuss the notion of noise sensitivity for Boolean functions, which captures the idea that certain events are very sensitive to small perturbations. While a few examples will be given, the main example we will examine from this perspective is so-called two-dimensional critical percolation from statistical mechanics. There will also be some connections to combinatorics and theoretical computer science. The mathematics behind the story includes, among other things, Fourier analysis on the hypercube. No background concerning percolation or Fourier analysis will be assumed.
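To make the notion tangible (this example is not from the abstract), here is a small Monte Carlo sketch contrasting two classical Boolean functions: parity, which is noise sensitive, and majority, which is noise stable. The function names and parameters are hypothetical choices for this illustration.

```python
import random

def agreement(f, n, eps, trials=10000):
    """Monte Carlo estimate of P[f(x) = f(y)], where x is uniform on
    {0,1}^n and y rerandomizes each bit of x independently with
    probability eps (a small perturbation of x)."""
    hits = 0
    for _ in range(trials):
        x = [random.randint(0, 1) for _ in range(n)]
        y = [random.randint(0, 1) if random.random() < eps else b for b in x]
        hits += f(x) == f(y)
    return hits / trials

parity = lambda x: sum(x) % 2              # flips with every changed bit
majority = lambda x: sum(x) > len(x) // 2  # robust to a few changed bits

random.seed(0)
# For n = 101 bits and eps = 0.1, parity's agreement probability is near
# 1/2 (chance level: noise sensitive), while majority's stays well above it.
```

A noise-sensitive function "forgets" its value under an arbitrarily small perturbation as n grows; the Fourier-analytic characterization mentioned in the abstract says exactly when this happens.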
Steen Markvorsen (Technical University of Denmark)
Title: A view towards some applications of Riemann-Finsler geometry
In this talk I will comment on some Finsler-type geometric key phenomena that arise naturally in such otherwise disparate fields and topics as: Seismology; dMRI (diffusion Magnetic Resonance Imaging); Wildfire Modelling; the metric of Colour Space; and (if time permits) Riemann-Finsler Conductivity.
Josephine Sullivan (KTH)
Title: Are All the Linear Regions of a ReLU Network Created Equal?
This presentation will describe recent research from the paper "Are All Linear Regions Created Equal?", published at AISTATS 2022. The function represented by most ReLU neural networks is piecewise affine. The number of linear regions defined by this function has been used as a proxy for the network's complexity. However, much empirical work suggests that, especially in the overparametrized setting, the number of linear regions does not capture the effective non-linearity of the network's learnt function. We propose an efficient algorithm for discovering linear regions and use it to investigate the effectiveness of linear region density in capturing the nonlinearity of trained VGGs and ResNets on CIFAR-10 and CIFAR-100. We contrast the results with a more principled nonlinearity measure based on function variation, highlighting the shortcomings of linear region density. Interestingly, our measure of nonlinearity correlates clearly with model-wise deep double descent, connecting reduced test error with reduced nonlinearity and increased local similarity of linear regions.
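As a minimal illustration of the piecewise-affine structure (a toy sketch, not the paper's algorithm): for a one-hidden-layer ReLU network on a one-dimensional input, each hidden unit relu(w·x + b) contributes a single breakpoint at x = -b/w, and the network is affine between consecutive breakpoints. The weights below are hand-picked for the example.

```python
def relu(z):
    return max(z, 0.0)

def net(x, W1, b1, W2, b2):
    """Tiny fully connected net: 1 input, len(W1) hidden ReLU units, 1 output."""
    hidden = [relu(w * x + b) for w, b in zip(W1, b1)]
    return sum(v * h for v, h in zip(W2, hidden)) + b2

# Toy weights chosen by hand for illustration.
W1, b1 = [1.0, -2.0, 0.5], [0.0, 1.0, -1.0]
W2, b2 = [1.0, 1.0, -3.0], 0.5

# Each unit's activation pattern switches exactly once, at x = -b/w, so in
# one dimension the linear regions are the intervals between breakpoints.
breakpoints = sorted(-b / w for w, b in zip(W1, b1) if w != 0)
n_regions = len(breakpoints) + 1  # at most (#hidden units + 1) regions in 1-D
print(breakpoints, n_regions)  # → [0.0, 0.5, 2.0] 4
```

In higher dimensions the breakpoints become hyperplanes and the regions become polyhedral cells, which is why counting (and, as the talk argues, weighting) them is nontrivial.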