UMR 5672

Machine Learning and Signal Processing seminar

Divergences for Implicit Generative Models.

Michael Arbel (post-doc in Grenoble, J. Mairal's team)
When? 16/12/2021, from 11:30 to 12:30
Participants: Michael Arbel


Abstract:

The talk will focus on integral probability metric (IPM) losses and f-divergences used for training implicit generative models. I will discuss how they relate to each other and why the critic function they define is so useful in the context of GANs.
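For reference, the two families of losses mentioned above can be written in their standard forms (these are the textbook definitions, not notation specific to the talk); in the IPM case, the maximizing function f plays the role of the critic:

```latex
% Integral probability metric over a critic class \mathcal{F}:
d_{\mathcal{F}}(P, Q) = \sup_{f \in \mathcal{F}} \;
  \mathbb{E}_{x \sim P}[f(x)] - \mathbb{E}_{y \sim Q}[f(y)]

% f-divergence, for a convex function f with f(1) = 0:
D_f(P \,\|\, Q) = \mathbb{E}_{x \sim Q}\!\left[
  f\!\left( \frac{\mathrm{d}P}{\mathrm{d}Q}(x) \right) \right]
```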
We will look at which properties these losses should satisfy to make GAN training more stable. In particular, we will examine the effect of problem-specific critic gradient penalties and critic regularization on GAN optimization.
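As one concrete instance of a critic gradient penalty (a WGAN-GP-style penalty, not necessarily the problem-specific penalties discussed in the talk), the idea can be sketched in a 1-D toy setting; the critic here is a hypothetical illustrative function:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical 1-D critic: a smooth scalar function (illustrative only).
def critic(x):
    return np.tanh(2.0 * x)

def gradient_penalty(critic, real, fake, rng, eps=1e-4):
    """WGAN-GP-style penalty: push the critic's gradient norm toward 1
    at random interpolations between real and generated samples."""
    t = rng.uniform(size=real.shape)
    x = t * real + (1.0 - t) * fake
    # Central finite-difference estimate of the critic's derivative.
    grad = (critic(x + eps) - critic(x - eps)) / (2.0 * eps)
    return float(np.mean((np.abs(grad) - 1.0) ** 2))

real = rng.normal(loc=1.0, scale=0.5, size=1000)   # stand-in "data" samples
fake = rng.normal(loc=-1.0, scale=0.5, size=1000)  # stand-in "generated" samples
gp = gradient_penalty(critic, real, fake, rng)
```

In practice this term is added to the critic's loss with a multiplicative coefficient; evaluating it on interpolated points is what constrains the critic's gradient along the region between the two distributions.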
In a second part, we will see how the information learned by the critic of an f-divergence can be used to define a new model: the Generalized Energy-Based Model (GEBM). These models combine two trained components: a base distribution (generally an implicit model), which can learn the support of data with low intrinsic dimension in a high-dimensional space; and an energy function (the critic), which refines the probability mass on the learned support. Empirically, on image-generation tasks, samples from the GEBM are of better quality than those from the learned base generator alone, indicating that, all else being equal, a GEBM will outperform a GAN of the same complexity. GEBMs also attain state-of-the-art performance on density-modelling tasks when using base measures with an explicit form.
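The combination of a base distribution and an energy that refines mass on its support can be sketched numerically with self-normalized importance weighting; this is a minimal 1-D toy, with a hypothetical energy function, and is not the paper's actual training or sampling procedure:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "base distribution": samples from a generator, here a standard Gaussian.
base_samples = rng.normal(loc=0.0, scale=1.0, size=10_000)

# Hypothetical energy function: places more mass near x = 1
# on the support covered by the base samples.
def energy(x):
    return -(x - 1.0) ** 2

# Self-normalized importance weights proportional to exp(energy):
# the energy re-weights (refines) the probability mass of the base.
w = np.exp(energy(base_samples))
w /= w.sum()

# Mean under the refined, GEBM-style model vs. under the base alone.
base_mean = base_samples.mean()          # close to 0
refined_mean = np.sum(w * base_samples)  # shifted toward the energy's mode
```

The point of the sketch is the division of labor: the base decides *where* samples can live, while the energy redistributes probability *within* that support.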

More information: Michael Arbel (https://michaelarbel.github.io/) is a post-doc in Grenoble with Julien Mairal, working on the properties of the various divergences between probability distributions used in the context of generative models.

Talk in room M7 101 (ENS de Lyon, Monod site, 1st floor, Recherche)
Meeting: MLSP_Michael_Arbel, password: mlsp2021.