
Low-dimensional representations for Stochastic Gradient Descent

When: 09/05/2023, from 13:00 to 14:00

Speaker: Ludovic Stéphan (postdoctoral researcher in the IdePHICS lab at EPFL)

Title: Low-dimensional representations for Stochastic Gradient Descent

Abstract: Stochastic Gradient Descent (SGD) is a simple and efficient training algorithm for neural networks, whose theoretical properties are still not fully understood. In this talk, I will present a possible avenue for studying this algorithm: the existence of low-dimensional, deterministic approximations to SGD trajectories. I will describe a framework, based on statistical physics insights, that unifies several such representations. In particular, this framework allows such representations to be obtained for any scaling of the network parameters, as long as the labelling function only depends on a few privileged directions.
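To make the idea of a low-dimensional deterministic description concrete, here is a minimal, illustrative sketch, not the framework of the talk: online SGD on a linear teacher-student model, the simplest case where the label depends on a single privileged direction w*. In the high-dimensional limit, the overlap m = <w, w*>/d and norm q = ||w||^2/d of the student weights are tracked by closed deterministic ODEs of Saad-Solla type. All names and parameter values below are assumptions chosen for illustration.

import numpy as np

# Illustrative sketch only, not the talk's framework: online SGD on a linear
# teacher-student model y = <w*, x>/sqrt(d) with squared loss. The summary
# statistics m = <w, w*>/d and q = ||w||^2/d follow closed deterministic ODEs
# in the high-dimensional limit.

rng = np.random.default_rng(0)
d, eta, T = 2000, 0.5, 5.0        # dimension, learning rate, horizon in ODE time
steps = int(T * d)                # one SGD step advances time by dt = 1/d

w_star = rng.standard_normal(d)
w_star *= np.sqrt(d) / np.linalg.norm(w_star)   # normalize so ||w*||^2 / d = 1
w = np.zeros(d)                                  # student initialization

for _ in range(steps):
    x = rng.standard_normal(d)                   # fresh sample each step (online SGD)
    err = (w_star - w) @ x / np.sqrt(d)          # y - y_hat
    w += eta * err * x / np.sqrt(d)              # one stochastic gradient step
m_sgd, q_sgd = w @ w_star / d, w @ w / d         # empirical summary statistics

# Deterministic low-dimensional description: Euler-integrate the closed ODEs
#   dm/dt = eta * (1 - m)
#   dq/dt = 2*eta*(m - q) + eta^2 * (1 - 2*m + q)
m, q, dt = 0.0, 0.0, 1.0 / d
for _ in range(steps):
    m, q = (m + dt * eta * (1 - m),
            q + dt * (2 * eta * (m - q) + eta**2 * (1 - 2 * m + q)))

print(f"overlap m: SGD {m_sgd:.3f} vs ODE {m:.3f}")   # agree up to O(1/sqrt(d))
print(f"norm    q: SGD {q_sgd:.3f} vs ODE {q:.3f}")   # fluctuations

Here the random, d-dimensional SGD trajectory is summarized by just two deterministic scalars; labelling functions that depend on several privileged directions lead to larger, but still finite-dimensional, systems of this kind.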

More information: https://www.lstephan.fr/

Talk in room M7 101 (Campus Monod, ENS de Lyon)