
A critical drift-diffusion equation: intermittent behavior

ENS, salle W, 45 rue d'Ulm, Paris, France

This talk is about a simple but rich model problem at the intersection of stochastic homogenization and singular stochastic PDEs: we consider a drift-diffusion process with a time-independent and divergence-free random drift of white-noise character. As already realized in the physics literature, the critical case of two space dimensions is the most interesting: the elliptic generator requires a small-scale cut-off for well-posedness, and one expects marginally super-diffusive behavior on large scales. I will explain the criticality of the two-dimensional case in the introductory course by scaling arguments. […]
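To make the model concrete (this sketch is not from the talk): a minimal simulation replaces the white-noise drift by a smooth Gaussian surrogate, a truncated random Fourier sum for the stream function, which plays the role of the small-scale cut-off the abstract mentions. Taking the drift as the rotated gradient of the stream function makes it divergence-free by construction. All parameters (number of modes, step size) are arbitrary illustrative choices.

```python
import math
import random

# Hypothetical illustration: a smooth random stream function psi built from
# 32 Fourier modes stands in for the white-noise field; truncating the sum
# is the small-scale cutoff.
random.seed(0)
MODES = [(random.gauss(0, 1),              # amplitude a_j
          random.uniform(0, 2 * math.pi),  # phase
          random.gauss(0, 1),              # wavevector component kx_j
          random.gauss(0, 1))              # wavevector component ky_j
         for _ in range(32)]

def drift(x, y):
    """Divergence-free drift b = (d psi/dy, -d psi/dx) for
    psi(x, y) = sum_j a_j * sin(kx_j*x + ky_j*y + phase_j)."""
    bx = by = 0.0
    for a, phase, kx, ky in MODES:
        c = a * math.cos(kx * x + ky * y + phase)
        bx += ky * c
        by -= kx * c
    return bx, by

def euler_maruyama(steps=2000, dt=1e-3):
    """One trajectory of the SDE dX = b(X) dt + sqrt(2) dW (Euler-Maruyama)."""
    x = y = 0.0
    for _ in range(steps):
        bx, by = drift(x, y)
        x += bx * dt + math.sqrt(2 * dt) * random.gauss(0, 1)
        y += by * dt + math.sqrt(2 * dt) * random.gauss(0, 1)
    return x, y
```

Averaging the squared displacement over many such trajectories is how one would probe the (marginally super-diffusive) large-scale behavior numerically.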

The dimer model in statistical mechanics

amphi Galois NIR

The dimer model describes the arrangement of diatomic molecules on the surface of a crystal. It belongs to the large family of statistical-mechanics models defined on graphs, whose other famous representatives include the Ising model and percolation. After a general introduction, we will look at the foundational results for the dimer model when the underlying graph is finite, in particular Kasteleyn's theorem, which gives a closed formula for the number of dimer configurations.
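To get a concrete feel for the quantity Kasteleyn's theorem computes, here is a brute-force count of dimer configurations (perfect matchings by dominoes) of a small rectangular grid. This is an added illustration, not part of the talk; Kasteleyn's method would instead evaluate the Pfaffian of a signed adjacency matrix, which scales to large graphs where this enumeration does not.

```python
from functools import lru_cache

def dimer_count(rows, cols):
    """Count dimer configurations (perfect matchings) of a rows x cols grid
    by always placing a domino at the first uncovered cell."""
    if (rows * cols) % 2:
        return 0  # odd number of cells: no perfect matching

    @lru_cache(maxsize=None)
    def solve(covered):
        # locate the first uncovered cell in the bitmask
        cell = 0
        while cell < rows * cols and covered & (1 << cell):
            cell += 1
        if cell == rows * cols:
            return 1  # every cell covered: one valid configuration
        r, c = divmod(cell, cols)
        total = 0
        # horizontal dimer covering (r, c) and (r, c+1)
        if c + 1 < cols and not covered & (1 << (cell + 1)):
            total += solve(covered | (1 << cell) | (1 << (cell + 1)))
        # vertical dimer covering (r, c) and (r+1, c)
        if r + 1 < rows and not covered & (1 << (cell + cols)):
            total += solve(covered | (1 << cell) | (1 << (cell + cols)))
        return total

    return solve(0)
```

For instance, the 2 x n strip gives the Fibonacci numbers, and the 4 x 4 grid has 36 configurations, matching Kasteleyn's closed product formula.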

ENS-Data Science colloquium – Lénaïc Chizat (EPFL)

Amphi Jaurès (29 Rue d'Ulm)

April 4, 2024, Lénaïc Chizat (EPFL)
Title: A Formula for Feature Learning in Large Neural Networks
Abstract: Deep learning succeeds by doing hierarchical feature learning, but tuning hyperparameters such as initialization scales, learning rates, etc., gives only indirect control over this behavior. This calls for theoretical tools to predict, measure and control feature learning. In this talk, we will first review various theoretical advances (signal propagation, infinite-width dynamics, etc.) that have led to a better understanding of the subtle impact of hyperparameters and architectural choices on the training dynamics. We will then introduce […]

Group theory seminar Jaikin-Zapirain/Marquis/Kharlampovich

14:00-17:00 Salle W

April 16 (Tuesday). For this seminar we will have the pleasure of listening to three talks:

14:00-14:45 Andrei Jaikin-Zapirain (UAM), "Compressed subgroups in free groups are inert"
15:00-15:45 Timothée Marquis (UCL), "Amalgams of rational unipotent groups and residual nilpotence"
16:15-17:00 Olga Kharlampovich (CUNY), "Quantification of separability of cubically convex-cocompact subgroups of RAAGs via representations"

Andrei Jaikin-Zapirain, "Compressed subgroups in free groups are inert": Let F be a free group. A finitely generated subgroup H is called compressed in F if it is not contained […]

CANCELED – Camillo De Lellis – CANCELED

Jussieu, salle 15-16-309, 4 Place Jussieu, Paris, France

Mini-course: Flows of nonsmooth vector fields

Consider a vector field v on Euclidean space. The classical Cauchy-Lipschitz (also known as Picard-Lindelöf) theorem states that, if the vector field is Lipschitz in space, for every initial datum x there is a unique trajectory γ starting at x at time 0 and solving the ODE γ'(t) = v(t, γ(t)). The theorem loses its validity as soon as v is slightly less regular. However, if we bundle all trajectories into a global map allowing x to vary, a celebrated theory started by DiPerna […]
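The existence-and-uniqueness proof behind the Cauchy-Lipschitz theorem is constructive: the Picard iteration γ_{k+1}(t) = x + ∫_0^t v(s, γ_k(s)) ds is a contraction when v is Lipschitz, so the iterates converge to the unique trajectory. A minimal numerical sketch (added here for illustration, not part of the course; the time grid and iteration counts are arbitrary choices):

```python
import math

def picard_iterates(v, x0, t_max=1.0, n_steps=1000, n_iters=25):
    """Approximate the unique solution of gamma'(t) = v(t, gamma(t)),
    gamma(0) = x0, by Picard iteration:
        gamma_{k+1}(t) = x0 + integral_0^t v(s, gamma_k(s)) ds,
    discretizing the integral with the trapezoid rule."""
    dt = t_max / n_steps
    ts = [i * dt for i in range(n_steps + 1)]
    gamma = [x0] * (n_steps + 1)  # gamma_0: constant initial guess
    for _ in range(n_iters):
        f = [v(t, g) for t, g in zip(ts, gamma)]
        new, acc = [x0], x0
        for i in range(n_steps):
            acc += 0.5 * (f[i] + f[i + 1]) * dt  # trapezoid increment
            new.append(acc)
        gamma = new
    return ts, gamma

# v(t, x) = x is globally Lipschitz, so the unique solution is x0 * e^t.
ts, g = picard_iterates(lambda t, x: x, 1.0)
```

For a merely bounded (non-Lipschitz) v, this iteration can fail to contract, which is exactly the regime where the DiPerna-Lions theory of the global flow map takes over.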

ENS-Data Science colloquium – Michael Jordan

Amphi Jaurès (29 Rue d'Ulm)

Michael Jordan (UC Berkeley and INRIA Paris)
Collaborative Learning, Information Asymmetries, and Incentives
This colloquium is organized around data science in a broad sense, with the goal of bringing together researchers with diverse backgrounds (including mathematics, computer science, physics, chemistry and neuroscience) but a common interest in dealing with complex, large-scale, or high-dimensional data. More information can be found on the web page of the seminar: https://data-ens.github.io/seminar/