Posts

In a couple of recent papers on applied optimal transport (OT), we used the fact that the so-called semi-dual formulation of OT …

In a project with a student, I recently derived a non-local diffusion PDE which, to my surprise, turns out to have a name: Stein …

Recent preprints and publications

Complete list of publications here.

We propose an acceleration of the Sinkhorn algorithm for unbalanced transport, together with a Frank-Wolfe method for solving the 1D unbalanced OT problem.

Unbalanced OT inherits the regularity of OT via a bootstrap argument. A polar factorization on a semi-direct product of groups is also proven.

How should one select optimal transport (OT) maps in machine-learning applications of OT? We give a first answer using the semi-dual formulation of OT.

We prove a local Polyak-Łojasiewicz condition for a (simplified) class of deep ResNets, and global convergence via overparameterization.

We refine our previous work on RKHS-SOS for OT to obtain an (almost) minimax rate of estimation of the optimal potentials.

Inverse consistency + neural networks exhibit smoothness as an implicit bias

Talks

We show how to break the curse of dimensionality for the estimation of the optimal transport distance between two smooth distributions for the squared Euclidean cost. The approach relies on essentially one tool: representing the inequality constraints in the dual formulation of OT by equality constraints with a sum of squares in a reproducing kernel Hilbert space. By showing that this representation is tight in the variational formulation, one can then leverage smoothness to break the curse.
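
In formulas, the device can be sketched as follows; the notation, in particular the feature map Phi and the positive operator A, is schematic and not necessarily the one used in the papers:

    % Kantorovich dual of OT with cost c(x,y) = |x-y|^2/2:
    %   \sup_{\varphi,\psi} \int \varphi \,\mathrm{d}\mu + \int \psi \,\mathrm{d}\nu
    %   subject to \varphi(x) + \psi(y) \le c(x,y) for all (x,y).
    %
    % SOS reformulation: replace the inequality by an equality with a
    % nonnegative slack written as a sum of squares in an RKHS with
    % feature map \Phi, where A is a positive semidefinite operator:
    \begin{align*}
      \sup_{\varphi,\,\psi,\,A \succeq 0} \quad
        & \int \varphi \,\mathrm{d}\mu + \int \psi \,\mathrm{d}\nu \\
      \text{s.t.} \quad
        & c(x,y) - \varphi(x) - \psi(y)
          = \langle \Phi(x,y),\, A\,\Phi(x,y) \rangle
          \quad \text{for all } (x,y).
    \end{align*}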

After presenting some background on optimal transport, we introduce entropic regularization, its link with the Schrödinger problem, and the Sinkhorn algorithm. We then present unbalanced optimal transport and the corresponding Sinkhorn algorithm, and show how to improve on the vanilla Sinkhorn algorithm in this particular case. Finally, we switch to a different problem, the statistical estimation of optimal transport: under smoothness assumptions on the transport maps, we achieve a parametric rate of estimation of the distance using a sum of squares in Sobolev spaces.
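
For reference, here is a minimal NumPy sketch of the vanilla (balanced) Sinkhorn iteration mentioned above; it is only schematic and does not include the unbalanced variant or the acceleration discussed in the talk.

    import numpy as np

    def sinkhorn(mu, nu, C, eps, n_iter=1000):
        """Entropic OT between histograms mu and nu with cost matrix C.

        Vanilla (balanced) Sinkhorn: alternately rescale the Gibbs kernel
        K = exp(-C/eps) so the coupling matches the prescribed marginals.
        Returns the coupling and the (unregularized) transport cost of it.
        """
        K = np.exp(-C / eps)                 # Gibbs kernel
        v = np.ones_like(nu)
        for _ in range(n_iter):
            u = mu / (K @ v)                 # match first marginal
            v = nu / (K.T @ u)               # match second marginal
        P = u[:, None] * K * v[None, :]      # entropic optimal coupling
        return P, np.sum(P * C)

    # Example: two histograms on 50 points of [0, 1] with squared cost.
    x = np.linspace(0, 1, 50)
    C = (x[:, None] - x[None, :]) ** 2
    mu = np.ones(50) / 50
    nu = np.exp(-((x - 0.7) ** 2) / 0.01)
    nu /= nu.sum()
    P, cost = sinkhorn(mu, nu, C, eps=1e-2)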

I explain the formulation of the Wasserstein-Fisher-Rao distance, introduced a few years ago as a natural extension of the Wasserstein L2 metric to the space of positive Radon measures. We present the equivalence between the dynamic formulation and a static formulation, in which the marginal constraints are relaxed with relative entropy, and then a connection with standard optimal transport on a cone space. The second part of the talk is motivated by this optimal transport distance: we study a generalized Camassa-Holm equation, for which we prove existence of minimizing generalized geodesics, à la Brenier.
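
Schematically, and up to the normalization of the growth penalty (the length-scale parameter delta below is a placeholder), the dynamic formulation discussed in the talk reads:

    % Dynamic (Benamou--Brenier-type) formulation of Wasserstein--Fisher--Rao,
    % up to the chosen normalization of the source term:
    \begin{equation*}
      \mathrm{WFR}^2(\rho_0, \rho_1)
      = \inf_{(\rho_t, v_t, \alpha_t)}
        \int_0^1 \!\! \int_\Omega
          \bigl( |v_t(x)|^2 + \delta^2\, \alpha_t(x)^2 \bigr)
          \,\mathrm{d}\rho_t(x) \,\mathrm{d}t,
    \end{equation*}
    % where the infimum runs over paths of positive measures \rho_t joining
    % \rho_0 to \rho_1 and satisfying the continuity equation with source
    %   \partial_t \rho_t + \operatorname{div}(\rho_t v_t) = \alpha_t \rho_t .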

Contact

Experience


Professor

Université Gustave Eiffel

Sep 2018 – Present, Noisy-Champs
Member of the research lab LIGM (Laboratoire d’informatique Gaspard Monge) and teaching signal processing.

Assistant professor

University Paris-Dauphine

Sep 2011 – Sep 2018, Paris
Member of the research lab Ceremade, UMR CNRS 7534, and teaching applied mathematics.

Research assistant

Imperial College

May 2009 – Sep 2011, London
Member of the Institute for Mathematical Sciences and the Math department.

PhD candidate

ENS Cachan (ENS Paris-Saclay)

Sep 2005 – May 2009, Paris area
Member of the lab CMLA.

PhD and HDR

Habilitation à diriger les recherches (HDR)

Optimal transport, diffeomorphisms and applications to imaging

PhD in applied mathematics

Hamiltonian formulation of diffeomorphic image matching