Diffusion-based generative models treat samples as independent and memoryless. I will show that relaxing each assumption leads to rich, exactly solvable physics — with no neural networks anywhere.
Giving samples a present, by coupling them through their evolving mean field, produces a McKean–Vlasov optimal transport problem whose self-consistent guidance is provably the linear interpolant between the endpoint means, for arbitrary distributions and any interaction schedule. Applied to building-fleet demand response, this saves over 20% in actuation energy.
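For concreteness, here is a minimal sketch of the mean-field setup in standard McKean–Vlasov notation; the drift b, noise level \sigma, and horizon T are generic placeholders, not the talk's specific model:
\[
dX_t = b(X_t,\mu_t)\,dt + \sigma\,dW_t, \qquad \mu_t = \mathrm{Law}(X_t),
\]
so each sample feels the others only through the evolving law \mu_t. The linear-interpolant claim then says that the mean m_t = \mathbb{E}[X_t] traces the straight line between the endpoint means,
\[
m_t = \Bigl(1-\tfrac{t}{T}\Bigr)\,m_0 + \tfrac{t}{T}\,m_T, \qquad 0 \le t \le T,
\]
independently of the endpoint distributions and of the interaction schedule.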
Giving samples a past produces a continual-learning agent whose memory is a Bridge Diffusion and whose forgetting, arising from a single lossy temporal coarse-graining step, obeys a universal linear capacity law with a Shannon-like constant.
Both constructions live in the world of Riccati equations, hyperbolic functions, and mixture linear algebra; the physics of the bridge, not the expressivity of a network, controls what is achievable.
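The talk's own formulas are not reproduced here, but the flavor of this Riccati/hyperbolic structure can be seen in two textbook bridge facts, recalled purely for illustration. Pinning a Brownian motion at X_T = y gives the bridge drift
\[
b(x,t) = \frac{y-x}{T-t},
\]
while pinning an Ornstein–Uhlenbeck process dX_t = -\theta X_t\,dt + \sigma\,dW_t at both endpoints gives a Gaussian bridge with variance
\[
\operatorname{Var}(X_t \mid X_0, X_T) = \frac{\sigma^2\,\sinh(\theta t)\,\sinh\bigl(\theta(T-t)\bigr)}{\theta\,\sinh(\theta T)},
\]
the hyperbolic functions arising, as usual, from linearizing a Riccati equation; as \theta \to 0 this recovers the Brownian-bridge variance \sigma^2\,t(T-t)/T.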
These seminars are made possible by the support of the CFM-ENS Chair « Modèles et Sciences des Données ».
Organizers: Giulio Biroli, Alex Cayco Gajic, Bruno Loureiro, Stéphane Mallat, Gabriel Peyré.