
Low-dimensional representations for Stochastic Gradient Descent

When: May 09, 2023, from 01:00 to 02:00

Speaker: Ludovic Stéphan (postdoctoral researcher in the IdePHICS lab at EPFL)

Title: Low-dimensional representations for Stochastic Gradient Descent

Abstract: Stochastic Gradient Descent (SGD) is a simple and efficient training algorithm for neural networks, yet its theoretical properties are still not fully understood. In this talk, I will present one avenue for studying this algorithm: the existence of low-dimensional, deterministic approximations to SGD trajectories. I will describe a framework, based on insights from statistical physics, that unifies several such representations. In particular, this framework yields such representations for any scaling of the network parameters, as long as the labelling function depends only on a few privileged directions.
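To make the kind of representation described in the abstract concrete, here is a minimal sketch (an illustration of the general idea, not the speaker's framework) in the textbook teacher-student setting: online SGD for a single-index model whose label g(theta.x / sqrt(d)) depends on a single privileged direction theta. In the large-d limit, the trajectory of the two summary statistics m = w.theta / d and q = |w|^2 / d concentrates around the solution of a deterministic ODE, in the spirit of the Saad-Solla analysis. All model choices, sizes, and hyperparameters below are illustrative assumptions.

import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sizes and hyperparameters, chosen only for illustration.
d, eta, T = 2000, 0.5, 10.0           # dimension, learning rate, horizon in SGD time t = step / d
g = np.tanh                           # student/teacher activation
gp = lambda z: 1.0 / np.cosh(z) ** 2  # derivative of tanh

# Teacher direction theta (the single privileged direction), normalized so
# that the teacher's local field theta.x / sqrt(d) is a standard Gaussian.
theta = rng.standard_normal(d)
theta *= np.sqrt(d) / np.linalg.norm(theta)

# --- 1) Direct online SGD on fresh Gaussian samples ----------------------
w = rng.standard_normal(d)            # random student initialization
m0, q0 = w @ theta / d, w @ w / d     # initial overlaps
sgd_m = [(0.0, m0)]
for k in range(int(T * d)):
    x = rng.standard_normal(d)
    lam, lam_star = w @ x / np.sqrt(d), theta @ x / np.sqrt(d)
    w -= (eta / np.sqrt(d)) * (g(lam) - g(lam_star)) * gp(lam) * x
    if (k + 1) % (d // 2) == 0:
        sgd_m.append(((k + 1) / d, w @ theta / d))

# --- 2) Deterministic ODE for the two summary statistics (m, q) ----------
# m and q determine the joint Gaussian law of the local fields
# (lam, lam_star), so the expected SGD update closes over (m, q) alone.
def drift(m, q, n_mc=50_000):
    # Monte Carlo estimate of the Gaussian expectations in the ODE.
    z1, z2 = rng.standard_normal((2, n_mc))
    lam_star = z1
    lam = m * z1 + np.sqrt(max(q - m * m, 0.0)) * z2
    delta = g(lam) - g(lam_star)
    dm = -eta * np.mean(delta * gp(lam) * lam_star)
    dq = (-2 * eta * np.mean(delta * gp(lam) * lam)
          + eta**2 * np.mean(delta**2 * gp(lam) ** 2))
    return dm, dq

m, q, dt = m0, q0, 0.05               # start the ODE from the empirical overlaps
ode_m = [(0.0, m)]
for i in range(int(T / dt)):
    dm, dq = drift(m, q)
    m, q = m + dt * dm, q + dt * dq
    ode_m.append(((i + 1) * dt, m))

print("final overlap m:  SGD %.3f  vs  ODE %.3f" % (sgd_m[-1][1], ode_m[-1][1]))

Already at moderate d the two trajectories agree up to finite-size fluctuations of order 1/sqrt(d): a d-dimensional stochastic process is captured by a two-dimensional deterministic one, which is the flavor of low-dimensional representation the talk generalizes to other parameter scalings and to labels depending on a few privileged directions.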

More information: https://www.lstephan.fr/

Talk in room M7 101 (Campus Monod, ENS de Lyon)