

Low-rank and sparse tensor decompositions with applications in signal processing

Cesar CAIAFA (CONICET, UBA)
When: Jan 16, 2025, from 01:00 to 02:00
Attendees: Cesar CAIAFA

Cesar CAIAFA, Senior research scientist at IAR-CONICET and University of Buenos Aires, Argentina

Title: “Low-rank and sparse tensor decompositions with applications in signal processing”


Abstract: In this talk, I will present low-rank and sparse versions of the Tucker tensor decomposition model, highlighting their applications in signal processing. I will begin with an introduction to tensors and multilinear algebra, followed by a discussion of the Tucker decomposition and its role in achieving efficient data compression and representation. I will demonstrate how tensors can be reconstructed from subsampled data by assuming a low multilinear rank structure. Then, to address the curse of dimensionality in sparse coding for multidimensional data, I will introduce the Sparse Tucker model. Both the low-rank and sparse Tucker models enable the generalization of classical Compressed Sensing (CS) theory from one-dimensional to multidimensional signals. I will illustrate these concepts with examples of linear inverse problems, including tensor completion and the reconstruction of Magnetic Resonance Images (MRI) from compressive multiway projections. Finally, I will present the ENCODE model, a Sparse Tucker decomposition designed to model diffusion MRI measurements, which facilitates brain connectomics studies by achieving substantial data compression and reducing computational costs.
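
As background for the Tucker part of the abstract, below is a minimal, self-contained sketch of a truncated higher-order SVD (HOSVD), one standard way to compute a Tucker approximation with a prescribed multilinear rank. It is not the speaker's method or code: the helper names (mode_n_product, hosvd, tucker_reconstruct) and the synthetic example are illustrative assumptions, using only NumPy.

import numpy as np


def mode_n_product(tensor, matrix, mode):
    # Multiply `tensor` by `matrix` along the given mode (mode-n product).
    moved = np.moveaxis(tensor, mode, 0)
    result = np.tensordot(matrix, moved, axes=1)
    return np.moveaxis(result, 0, mode)


def hosvd(tensor, ranks):
    # Truncated higher-order SVD: a simple way to obtain a Tucker
    # approximation with multilinear rank given by `ranks`.
    factors = []
    for mode, r in enumerate(ranks):
        # Mode-n unfolding, then keep the leading r left singular vectors.
        unfolding = np.moveaxis(tensor, mode, 0).reshape(tensor.shape[mode], -1)
        U, _, _ = np.linalg.svd(unfolding, full_matrices=False)
        factors.append(U[:, :r])
    # Core tensor: project the data onto the factor subspaces along every mode.
    core = tensor
    for mode, U in enumerate(factors):
        core = mode_n_product(core, U.T, mode)
    return core, factors


def tucker_reconstruct(core, factors):
    # Rebuild the full tensor from its Tucker representation.
    out = core
    for mode, U in enumerate(factors):
        out = mode_n_product(out, U, mode)
    return out


# Illustrative example: a synthetic 30 x 40 x 50 tensor with multilinear rank (3, 3, 3)
# is represented exactly by a 3 x 3 x 3 core plus three thin factor matrices.
rng = np.random.default_rng(0)
G = rng.standard_normal((3, 3, 3))
U1, U2, U3 = (rng.standard_normal((n, 3)) for n in (30, 40, 50))
X = tucker_reconstruct(G, [U1, U2, U3])

core, factors = hosvd(X, ranks=(3, 3, 3))
X_hat = tucker_reconstruct(core, factors)
print(np.linalg.norm(X - X_hat) / np.linalg.norm(X))  # ~1e-15: reconstruction is exact here

In this toy case the Tucker form stores 3^3 + 3*(30 + 40 + 50) = 387 numbers instead of the 60,000 entries of the full tensor, which illustrates the kind of compression the abstract refers to; the low-rank and sparse compressed sensing results discussed in the talk build on this representation.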