Machine Learning and Signal Processing seminar

Nonsmooth implicit differentiation

Tony Silveti-Falls (CentraleSupélec/University of Paris-Saclay)
When? 04/04/2023, from 14:00 to 15:00


Abstract: In view of training increasingly complex learning architectures, we establish a nonsmooth implicit function theorem with an operational calculus. Our result applies to most practical problems (i.e., definable problems) provided that a nonsmooth form of the classical invertibility condition is fulfilled. This approach allows for formal subdifferentiation: for instance, replacing derivatives by Clarke Jacobians in the usual differentiation formulas is fully justified for a wide class of nonsmooth problems. Moreover this calculus is entirely compatible with algorithmic differentiation (e.g., backpropagation). We provide several applications such as training deep equilibrium networks, training neural nets with conic optimization layers, or hyperparameter tuning for nonsmooth Lasso-type models. To show the sharpness of our assumptions, we present numerical experiments showcasing the extremely pathological gradient dynamics one can encounter when applying implicit algorithmic differentiation without any hypothesis.
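To give a concrete flavor of the Lasso-type application mentioned above, here is a minimal sketch (not taken from the talk; all numbers and names are illustrative) of nonsmooth implicit differentiation on a one-dimensional Lasso problem. The solution is computed as a fixed point of the ISTA iteration, and the hypergradient with respect to the regularization weight is obtained by differentiating the fixed-point equation, using an element of the Clarke Jacobian of the soft-thresholding operator in place of a classical derivative.

```python
import math

def soft(z, tau):
    # soft-thresholding: the proximal operator of tau * |.|
    return math.copysign(max(abs(z) - tau, 0.0), z)

y, lam, t = 2.0, 0.5, 0.8  # data point, regularization weight, step size

# ISTA fixed-point iteration for min_x 0.5*(x - y)^2 + lam*|x|:
#   x = F(x, lam) := soft(x - t*(x - y), t*lam)
x = 0.0
for _ in range(200):
    x = soft(x - t * (x - y), t * lam)

# Closed-form Lasso solution for comparison: sign(y) * max(|y| - lam, 0)
x_star = math.copysign(max(abs(y) - lam, 0.0), y)

# Implicit differentiation of the fixed-point equation x = F(x, lam):
#   dx/dlam = (dF/dlam) / (1 - dF/dx),
# where the "derivative" of soft() is an element of its Clarke Jacobian.
z = x - t * (x - y)
s = 1.0 if abs(z) > t * lam else 0.0  # Clarke Jacobian element of soft at z
dF_dx = s * (1.0 - t)
dF_dlam = -t * s * math.copysign(1.0, z)
dx_dlam = dF_dlam / (1.0 - dF_dx)
# Here the analytic hypergradient is -sign(y) = -1 since |y| > lam.
```

For this instance the fixed point converges to the closed-form solution x* = 1.5, and the implicit formula returns the analytic hypergradient -1; in the nonsmooth regime (|y| close to lam) the choice of Clarke Jacobian element is exactly what the theorem in the talk justifies.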

More information: https://tonysf.github.io/

Talk in room M7 101 (ENS de Lyon, Monod site, 1st floor, Recherche side of the M7 building)