- Optimal transport; applications and numerics.
- Diffeomorphic flows; machine learning (Neural ODE) and medical image registration (LDDMM) applications.
- Calculus of variations and geometry; applications to shape spaces and fluid flows.
- Applied and computational mathematics.

Complete list of publications here.

Learning maps between data samples is fundamental. Applications range from representation learning, image translation and generative …


In this paper, we show that smoothness of solutions of optimal transport can actually be leveraged to define statistical estimators …


Comparing metric measure spaces (i.e., a metric space endowed with a probability distribution) is at the heart of many machine learning …

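One standard tool for comparing metric measure spaces (an assumption on my part about the precise setting of this work) is the Gromov-Wasserstein distance, which matches pairwise distances across the two spaces; in squared form:

```latex
\mathrm{GW}^2\big((X,d_X,\mu),(Y,d_Y,\nu)\big)
  = \min_{\pi \in \Pi(\mu,\nu)}
    \iint \big| d_X(x,x') - d_Y(y,y') \big|^2 \, \mathrm{d}\pi(x,y)\, \mathrm{d}\pi(x',y')
```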

Continuous-depth neural networks can be viewed as deep limits of discrete neural networks whose dynamics resemble a discretization of …

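The "discretization" point can be made concrete with a minimal NumPy sketch (the vector field `f` and its weights are illustrative, not taken from the paper): an explicit Euler step of an ODE has exactly the form of a residual-network layer.

```python
import numpy as np

def f(x, t, W, b):
    """A toy time-dependent vector field (weights are illustrative)."""
    return np.tanh(W @ x + t * b)

def euler_flow(x0, W, b, n_steps=10):
    """Explicit Euler discretization of x' = f(x, t) on [0, 1]:
    each step x <- x + h * f(x, t) is a residual-network layer."""
    x, h = x0.copy(), 1.0 / n_steps
    for k in range(n_steps):
        x = x + h * f(x, k * h, W, b)
    return x

rng = np.random.default_rng(0)
W = rng.standard_normal((3, 3)) * 0.1
b = rng.standard_normal(3)
x1 = euler_flow(np.ones(3), W, b)
```

Letting `n_steps` grow while the step size shrinks is the "deep limit" in question: the discrete network converges to the continuous-depth flow.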

The squared Wasserstein distance is a natural quantity to compare probability distributions in a non-parametric setting. This quantity …

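In one dimension the squared Wasserstein-2 distance between two empirical measures with the same number of atoms has a closed form, since the monotone (sorted) coupling is optimal; a minimal sketch (function name mine):

```python
import numpy as np

def w2_squared_1d(xs, ys):
    """Squared Wasserstein-2 distance between two empirical measures
    on the real line with equal numbers of atoms: sort both samples
    and match them in order (the monotone coupling is optimal in 1D)."""
    xs, ys = np.sort(xs), np.sort(ys)
    return np.mean((xs - ys) ** 2)

# translating a sample by c shifts it by squared W2 distance c**2
x = np.array([0.0, 1.0, 2.0])
print(w2_squared_1d(x, x + 3.0))  # 9.0
```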

Large deformation diffeomorphic metric mapping (LDDMM) is a popular approach for deformable image registration with nice mathematical …
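For context, LDDMM registers images $I_0, I_1$ through the variational problem (in one common normalization; $V$ an admissible Hilbert space of velocity fields, $\varphi$ the flow of $v$, $\sigma$ a trade-off parameter):

```latex
\min_{v} \; \int_0^1 \| v_t \|_V^2 \, \mathrm{d}t
  + \frac{1}{\sigma^2} \big\| I_0 \circ \varphi_1^{-1} - I_1 \big\|_{L^2}^2,
\qquad \partial_t \varphi_t = v_t \circ \varphi_t, \;\; \varphi_0 = \mathrm{id}
```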

We explain the formulation of the Wasserstein-Fisher-Rao distance, introduced a few years ago as the natural extension of the Wasserstein L2 metric to the space of positive Radon measures. We present the equivalence between the dynamic formulation and a static formulation in which the marginal constraints are relaxed with relative entropy, and then a connection with standard optimal transport on a cone space. The second part of the talk is motivated by this optimal transport distance: we study a generalized Camassa-Holm equation and the existence of minimizing generalized geodesics, à la Brenier.
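For reference, in one common normalization (with $\delta > 0$ a length-scale parameter, $\rho_t$ a path of positive measures, $v_t$ a velocity field and $\alpha_t$ a growth rate), the dynamic formulation reads:

```latex
\mathrm{WFR}^2(\mu,\nu)
  = \inf_{(\rho, v, \alpha)}
    \int_0^1 \!\! \int \Big( |v_t|^2 + \tfrac{\delta^2}{4}\, \alpha_t^2 \Big)
    \, \mathrm{d}\rho_t \, \mathrm{d}t
\quad \text{s.t.} \quad
\partial_t \rho_t + \operatorname{div}(\rho_t v_t) = \rho_t\, \alpha_t,
\qquad \rho_0 = \mu, \;\; \rho_1 = \nu
```

Compared with the Benamou-Brenier formulation of the Wasserstein metric, the source term $\rho_t \alpha_t$ allows mass to be created or destroyed at a cost, which is what makes the metric well defined between measures of different total mass.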

After presenting some background on optimal transport, we explain why the curse of dimensionality arises when estimating optimal transport distances. We then cover computational techniques for solving optimal transport, in particular entropic regularization. We present Sinkhorn divergences, which have been proven to overcome the curse of dimensionality, although not for approximating optimal transport distances themselves. We then show that the smoothness of solutions of optimal transport can actually be leveraged to define statistical estimators which are amenable to computation. We use the dual formulation of optimal transport, whose solutions enjoy a particular structure: they can be written as a sum of squares in a reproducing kernel Hilbert space. As is standard in the sum-of-squares (SOS) literature, the resulting problem can be solved via a semidefinite programming (SDP) formulation. By using a soft penalty in Sobolev spaces on the optimal potential and a trace-class positive self-adjoint operator, we can define an estimator of optimal transport which is both statistically and computationally efficient.
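The entropic regularization mentioned above is solved by the Sinkhorn fixed-point iteration; a minimal NumPy sketch on a toy discrete problem (the regularization strength `eps` and iteration count are illustrative choices, not values from the talk):

```python
import numpy as np

def sinkhorn(a, b, C, eps=0.5, n_iters=500):
    """Entropy-regularized OT between discrete measures a and b with
    cost matrix C: alternately rescale the Gibbs kernel so that the
    plan's row and column marginals match a and b."""
    K = np.exp(-C / eps)          # Gibbs kernel
    u = np.ones_like(a)
    for _ in range(n_iters):
        v = b / (K.T @ u)         # scale columns to match b
        u = a / (K @ v)           # scale rows to match a
    return u[:, None] * K * v[None, :]   # transport plan

# toy example: two uniform measures on 5 points of [0, 1]
x = np.linspace(0.0, 1.0, 5)
y = np.linspace(0.0, 1.0, 5)
C = (x[:, None] - y[None, :]) ** 2       # squared Euclidean cost
a = np.full(5, 0.2)
b = np.full(5, 0.2)
P = sinkhorn(a, b, C)
```

At convergence `P` is a coupling: its row sums equal `a` and its column sums equal `b`, and as `eps` decreases it approaches an optimal (unregularized) transport plan at the price of slower convergence.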

We show how to break the curse of dimensionality for the estimation of the optimal transport distance between two smooth distributions, for the squared Euclidean cost. The approach relies on essentially one tool: representing the inequality constraints in the dual formulation of OT as equality constraints with a sum-of-squares term in a reproducing kernel Hilbert space. By showing that this representation is tight in the variational formulation, one can then leverage smoothness to break the curse.
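Schematically (with $\Phi$ a feature map into an RKHS and $A$ a positive semidefinite operator — notation mine, not necessarily the paper's), the dual potentials $(\varphi, \psi)$ for the quadratic cost satisfy $\varphi(x) + \psi(y) \le \tfrac{1}{2}\|x-y\|^2$, and the inequality is replaced by an equality with a nonnegative sum-of-squares slack:

```latex
\varphi(x) + \psi(y) + \langle \Phi(x,y),\, A\, \Phi(x,y) \rangle
  = \tfrac{1}{2}\, \| x - y \|^2,
\qquad A \succeq 0
```

Since $\langle \Phi, A\Phi \rangle \ge 0$ whenever $A \succeq 0$, the equality constraint enforces the dual inequality by construction; tightness of this representation is what the smoothness assumptions buy.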

Member of the research lab LIGM (Laboratoire d’informatique Gaspard Monge) and teaching signal processing.

Member of the research lab Ceremade, UMR CNRS 7534, and teaching applied mathematics.

Member of the Institute for Mathematical Sciences and the Math department.

Optimal transport, Diffeomorphisms and applications to imaging

Hamiltonian formulation of diffeomorphic image matching
