Generative Modeling

  1. Demystifying MMD GANs
    A Maximum Mean Discrepancy loss for Generative Adversarial Networks. ICLR 2018
  2. On gradient regularizers for MMD-GANs
    A gradient penalty method for regularizing MMD-GANs with state-of-the-art performance. NeurIPS 2018
  3. Generalized energy based models
    A model combining implicit and explicit modeling for learning distributions with small intrinsic dimension. ICLR 2021
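The quantity underlying the first two papers is the kernel maximum mean discrepancy (MMD). A minimal NumPy sketch of its standard unbiased estimator, using a Gaussian kernel as an illustrative choice:

```python
import numpy as np

def gaussian_kernel(x, y, sigma=1.0):
    """Gaussian (RBF) kernel matrix between the rows of x and y."""
    sq = np.sum(x**2, 1)[:, None] + np.sum(y**2, 1)[None, :] - 2 * x @ y.T
    return np.exp(-sq / (2 * sigma**2))

def mmd2_unbiased(x, y, sigma=1.0):
    """Unbiased estimate of the squared MMD between samples x and y."""
    m, n = len(x), len(y)
    kxx = gaussian_kernel(x, x, sigma)
    kyy = gaussian_kernel(y, y, sigma)
    kxy = gaussian_kernel(x, y, sigma)
    # Drop the diagonal self-similarity terms so the estimator is unbiased.
    term_xx = (kxx.sum() - np.trace(kxx)) / (m * (m - 1))
    term_yy = (kyy.sum() - np.trace(kyy)) / (n * (n - 1))
    return term_xx + term_yy - 2 * kxy.mean()
```

The estimate is near zero for two samples from the same distribution and grows as the distributions separate, which is what makes it usable as a GAN training signal.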

High-Dimensional Sampling

  1. Annealed Flow Transport Monte Carlo
    A sampling algorithm combining Sequential Monte Carlo and Normalizing Flows. ICML 2021
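Annealed Flow Transport builds on Sequential Monte Carlo. A minimal SMC sketch with a geometric annealing path, multinomial resampling, and one Metropolis move per temperature; the paper's learned normalizing-flow transport maps are omitted, and all step sizes here are illustrative:

```python
import numpy as np

def smc_annealed(log_target, n_particles=500, n_steps=20, step=0.5, seed=0):
    """Minimal annealed SMC: move from N(0, 1) to the (unnormalized) target
    along a geometric path, reweighting, resampling, and applying one
    Metropolis random-walk move at each temperature."""
    rng = np.random.default_rng(seed)
    x = rng.normal(size=n_particles)              # particles from the base N(0, 1)
    log_base = lambda t: -0.5 * t**2
    betas = np.linspace(0.0, 1.0, n_steps + 1)
    log_z = 0.0                                   # running log normalizing-constant estimate
    for b_prev, b in zip(betas[:-1], betas[1:]):
        # Incremental importance weights for the newly tempered target.
        log_w = (b - b_prev) * (log_target(x) - log_base(x))
        log_z += np.log(np.mean(np.exp(log_w - log_w.max()))) + log_w.max()
        w = np.exp(log_w - log_w.max())
        w /= w.sum()
        x = rng.choice(x, size=n_particles, p=w)  # multinomial resampling
        # One Metropolis random-walk move targeting the tempered density.
        log_tempered = lambda t: (1 - b) * log_base(t) + b * log_target(t)
        prop = x + step * rng.normal(size=n_particles)
        accept = np.log(rng.uniform(size=n_particles)) < log_tempered(prop) - log_tempered(x)
        x = np.where(accept, prop, x)
    return x, log_z
```

The returned `log_z` estimates the log-ratio of the target's normalizing constant to the base's, a byproduct that plain MCMC does not provide.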

Kernel Methods

  1. Kernel Conditional Exponential Family
    An algorithm for conditional density estimation using Kernel Exponential Families. AISTATS 2018
  2. Efficient and principled score estimation with Nystrom kernel exponential families
    An efficient algorithm for density estimation. AISTATS 2018
  3. Patches in Deep Convolutional Kernel Methods
    Code to reproduce the paper's experiments on patch-based convolutional kernel methods. ICLR 2021
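The efficiency in the score-estimation paper comes from the Nyström method, which replaces a full n-by-n kernel matrix with a low-rank factorization built from a few inducing points. A minimal sketch of that approximation alone (function names are my own, not the paper's code):

```python
import numpy as np

def gaussian_kernel(x, y, sigma=1.0):
    """Gaussian (RBF) kernel matrix between the rows of x and y."""
    sq = np.sum(x**2, 1)[:, None] + np.sum(y**2, 1)[None, :] - 2 * x @ y.T
    return np.exp(-sq / (2 * sigma**2))

def nystrom_approx(x, m=20, sigma=1.0, seed=0):
    """Low-rank Nystrom approximation K ~= K_nm K_mm^+ K_nm^T
    using m randomly chosen inducing points."""
    rng = np.random.default_rng(seed)
    idx = rng.choice(len(x), size=m, replace=False)
    k_nm = gaussian_kernel(x, x[idx], sigma)             # n x m cross-kernel
    k_mm = gaussian_kernel(x[idx], x[idx], sigma)        # m x m inducing block
    return k_nm @ np.linalg.pinv(k_mm) @ k_nm.T
```

When the kernel matrix has fast spectral decay, as Gaussian kernels on concentrated data typically do, a small m already gives a near-exact reconstruction while reducing the dominant costs from O(n^3) to roughly O(n m^2).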

Optimal Transport and Gradient Flows

  1. Maximum Mean Discrepancy Gradient Flow
    A noise-injection algorithm for optimizing the Maximum Mean Discrepancy, viewed as a gradient flow over particles. NeurIPS 2019
  2. Kernelized Wasserstein Natural Gradient
    An efficient and scalable algorithm for estimating the Wasserstein natural gradient. ICLR 2020
  3. Synchronizing Probability Measures on Rotations via Optimal Transport
    A simple algorithm based on optimal transport for synchronizing rotations. CVPR 2020
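The MMD gradient flow moves particles along the negative gradient of the MMD witness function, and the noise-injection scheme evaluates that gradient at perturbed particle locations to help particles escape regions where the kernel signal vanishes. A rough one-dimensional NumPy sketch, with step sizes and noise level that are illustrative rather than the paper's:

```python
import numpy as np

def witness_grad(x_eval, x, y, sigma=1.0):
    """Gradient of the MMD witness function (particle measure x vs.
    target measure y, Gaussian kernel), evaluated at the points x_eval."""
    def grad_sum(a, b):
        diff = a[:, None, :] - b[None, :, :]                    # (n_eval, n, d)
        k = np.exp(-np.sum(diff**2, -1) / (2 * sigma**2))       # kernel values
        return np.mean(-diff * (k / sigma**2)[..., None], axis=1)
    return grad_sum(x_eval, x) - grad_sum(x_eval, y)

def mmd_flow(x0, y, steps=300, lr=0.5, noise=0.5, sigma=1.5, seed=0):
    """Noisy particle descent on the MMD: the witness gradient is evaluated
    at noise-perturbed particle locations (noise injection)."""
    rng = np.random.default_rng(seed)
    x = x0.copy()
    for _ in range(steps):
        eps = noise * rng.normal(size=x.shape)
        x -= lr * witness_grad(x + eps, x, y, sigma)
    return x
```

Descending the witness function attracts particles toward the target sample and repels them from over-represented regions of their own measure, so the particle cloud drifts onto the target distribution.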

Reinforcement Learning

  1. Efficient Wasserstein Natural Gradients for Reinforcement Learning
    An efficient algorithm to incorporate behavioral geometry for policy gradient in RL. ICLR 2021
  2. Tactical Optimism and Pessimism for Deep Reinforcement Learning
    A simple algorithm for dynamically trading off between optimism and pessimism in RL. NeurIPS 2021
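The optimism-pessimism trade-off in the second paper can be caricatured as blending twin-critic value estimates with a coefficient chosen online by a bandit. The toy functions below only illustrate that blending idea; the names and the multiplicative-weights rule are my own simplifications, and the paper itself works with quantile critics and an adaptive bandit:

```python
import numpy as np

def blended_target(q1, q2, beta):
    """Blend twin-critic estimates: beta=1 is fully optimistic (elementwise max),
    beta=0 fully pessimistic (elementwise min)."""
    q = np.stack([q1, q2])
    return beta * q.max(0) + (1 - beta) * q.min(0)

def bandit_update(weights, arm, reward, eta=0.1):
    """Multiplicative-weights update: shift probability toward the optimism
    level (arm) that recently yielded higher return."""
    weights = weights.copy()
    weights[arm] *= np.exp(eta * reward)
    return weights / weights.sum()
```

Pessimism (the min) counters overestimation bias, optimism (the max) encourages exploration; adapting `beta` per task is the "tactical" part.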