January 31 (Wednesday)
14.00 - 14.45 Valérie Berthé (Paris VII), "Dendric subshifts and groups"
15.00 - 15.45 Nguyen-Bac Dang (Orsay), "Variation of the Hausdorff dimension of limit sets and degenerating Schottky groups"
16.15 - 17.00 Bruno Duchesne (Orsay), TBA

Valérie Berthé, "Dendric subshifts and groups": We discuss a family of symbolic dynamical systems with remarkable group properties, the family of dendric words. This family includes numerous classical families of symbolic dynamical systems, among them codings of interval exchanges. Their return words form […]
Optimal transport (OT) has recently gained significant interest in statistics and machine learning: it serves as a natural tool for comparing probability distributions in a geometrically faithful manner. However, OT suffers from the curse of dimensionality, as it may require a sample size that grows exponentially with the dimension. This seminar will be divided into two parts: a tutorial on optimal transport, where I will review the Monge and Kantorovich formulations and their connection to gradient flow PDEs via the minimizing movement scheme, and a more advanced discussion on […]
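As a concrete illustration of the Kantorovich formulation mentioned above (not code from the seminar): in one dimension the optimal coupling between two empirical measures simply matches sorted samples, so the 1-Wasserstein distance can be computed without solving a linear program. The function name `wasserstein_1d` is ours, chosen for the sketch.

```python
# Minimal sketch: 1-Wasserstein (Kantorovich) distance between two
# empirical measures on the real line. In 1D the optimal transport plan
# pairs the i-th smallest sample of one measure with the i-th smallest
# of the other, so sorting replaces the linear program.
def wasserstein_1d(xs, ys):
    """W1 between uniform empirical measures on xs and ys (equal sizes)."""
    assert len(xs) == len(ys)
    return sum(abs(a - b) for a, b in zip(sorted(xs), sorted(ys))) / len(xs)

# Example: translating every sample by c costs exactly c.
print(wasserstein_1d([0, 1, 2], [3, 4, 5]))  # -> 3.0
```

In higher dimensions no such shortcut exists, which is one way the curse of dimensionality enters: the general discrete problem is a linear program over couplings, and the statistical sample complexity grows with the dimension.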
The aim of the talk will be to present a theorem of Adams stating which spheres can be endowed with the structure of a topological group, as a pretext for introducing the idea of topological invariants and their structure (in this case, K-theory).
I will talk about my joint work with Laura Monk, in which we study the smallest eigenvalue of the Laplacian on a compact hyperbolic surface chosen at random according to the Weil-Petersson measure.
Consider Z^2, and assign a random length of 1 or 2 to each edge based on independent fair coin tosses. The resulting random geometry, first passage percolation, is conjectured to have a scaling limit. Most random planar geometric models (including hidden geometries) should exhibit the same limiting behavior. I will explain the basics of the limiting geometry, the "directed landscape", and its relation to traffic jams, Tetris, coffee stains and random matrices.
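The model in the abstract is simple enough to simulate directly (this sketch is ours, not from the talk): put i.i.d. edge lengths 1 or 2 on a finite box of Z^2 and compute the passage time from the origin with Dijkstra's algorithm. The function name `passage_times` is illustrative.

```python
import heapq
import random

# Minimal sketch of first passage percolation on an n x n box of Z^2:
# each edge gets length 1 or 2 by an independent fair coin toss, and the
# passage time T(0, v) is the resulting weighted graph distance from the
# origin, computed with Dijkstra's algorithm.
def passage_times(n, seed=0):
    rng = random.Random(seed)
    weight = {}  # edge -> sampled length, cached so each edge is tossed once

    def w(u, v):
        e = (min(u, v), max(u, v))
        if e not in weight:
            weight[e] = rng.choice((1, 2))
        return weight[e]

    dist = {(0, 0): 0}
    heap = [(0, (0, 0))]
    while heap:
        d, (x, y) = heapq.heappop(heap)
        if d > dist.get((x, y), float("inf")):
            continue  # stale heap entry
        for nbr in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
            if 0 <= nbr[0] < n and 0 <= nbr[1] < n:
                nd = d + w((x, y), nbr)
                if nd < dist.get(nbr, float("inf")):
                    dist[nbr] = nd
                    heapq.heappush(heap, (nd, nbr))
    return dist

T = passage_times(20)
# Since every edge has length in {1, 2}, the passage time to (19, 19) is
# sandwiched between the graph distance 38 and twice that, 76.
print(38 <= T[(19, 19)] <= 76)  # -> True
```

The conjectured scaling limit concerns the large-n fluctuations of such passage times, which simulations like this can only hint at.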
Noah A. Smith (University of Washington) Breaking Down Language Models “Language models are the only thing we have in natural language processing that could be considered scientific.” A collaborator of mine said this more than a decade ago, long before LMs emerged as the single most important technology to come out of our field. In these exciting times, I seek both to make the study of LMs more scientific, and to make LMs more practically beneficial. In this talk, I’ll first draw from recent work from my UW group that starts […]