Invited talk @IEM @EPFL, May 24th 2024

Frugality in machine learning: Sparsity, a value for the future?

Sparse vectors and sparse matrices play a cross-cutting role in signal and image processing: they have led to successful approaches that efficiently address tasks as diverse as data compression, fast transforms, signal denoising, source separation, and, more generally, inverse problems. To what extent can the potential of sparsity also be leveraged to achieve more frugal (deep) learning techniques? Through an overview of recent explorations around this theme, I will compare and contrast classical sparse regularization for inverse problems with its natural extensions aimed at learning neural networks with sparse connections. Along the way, I will notably highlight the role of the rescaling invariances of modern deep parameterizations, which come with their curses and blessings.