CS Machine Learning Seminar: Feature learning via gradient descent

Tuesday, October 4, 2022
3:30 p.m.
Online presentation
Soheil Feizi
sfeizi@umd.edu

Feature learning via gradient descent beyond the NTK/lazy regime and deep learning for inverse problems

Mahdi Soltanolkotabi
Director, Center for AI Foundations for the Sciences (AIF4S)
University of Southern California

Zoom link: https://umd.zoom.us/j/95197245230?pwd=cDRlVWRVeXBHcURGQkptSHpIS0VGdz09

Password: 828w

In the first part of the talk, I will focus on demystifying the generalization and feature-learning capabilities of modern overparameterized neural networks. Our results are based on an intriguing spiking phenomenon in gradient descent that puts the iterates on a particular trajectory toward solutions that are not only globally optimal but also generalize well. Notably, this analysis overcomes a major theoretical bottleneck in the existing literature and goes beyond the “lazy” or “NTK” training regime, which requires unrealistic hyperparameter choices (e.g., very small step sizes, large initialization, or wide models).

In the second part of the talk, I will discuss the challenges and opportunities of using AI for computational imaging and, more broadly, scientific applications. Specifically, I will discuss an emerging literature on deep learning for inverse problems that has been very successful for a variety of image and signal recovery and restoration tasks. In particular, for medical image reconstruction I will discuss our work on designing new architectures that lead to state-of-the-art performance and report on techniques that significantly reduce the data required for training.
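To make the alignment behavior from the first part concrete, here is a minimal numerical sketch in the spirit of the low-rank matrix reconstruction setting of the first paper listed below: plain gradient descent on an overparameterized factorization M ≈ U Uᵀ, started from a small random initialization. The dimensions, step size, and initialization scale are illustrative assumptions, not the paper's actual experiments.

import numpy as np

# Illustrative sketch only: gradient descent on the overparameterized
# factorization M ≈ U U^T from a small random initialization.
rng = np.random.default_rng(0)
n, k = 50, 10                              # ambient dimension, factor width (k > rank)
x = rng.standard_normal(n)
x /= np.linalg.norm(x)
M = np.outer(x, x)                         # rank-1 symmetric PSD target

U = 1e-4 * rng.standard_normal((n, k))     # small random initialization
eta = 0.1                                  # step size

for t in range(501):
    R = U @ U.T - M                        # symmetric residual
    U -= eta * 2 * (R @ U)                 # gradient of 0.5 * ||U U^T - M||_F^2
    if t % 100 == 0:
        u_top = np.linalg.svd(U)[0][:, 0]  # top left singular vector of U
        print(t, abs(u_top @ x), np.linalg.norm(R))

In this toy run, the alignment |u_topᵀ x| typically climbs to near 1 well before the residual becomes small: gradient descent first locks onto the signal direction (akin to spectral learning) and only then fits the target.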
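On the inverse-problems side, unrolled networks (the family HUMUS-Net belongs to) alternate data-consistency steps derived from the forward model with learned refinement. The sketch below is a generic, hypothetical unrolling, not the HUMUS-Net architecture: a tiny CNN stands in for the learned block, and A / At are placeholders for the measurement operator and its adjoint (e.g., an undersampled Fourier transform in MRI).

import torch
import torch.nn as nn

class UnrolledRecon(nn.Module):
    # Toy unrolled reconstruction: data consistency + learned refinement.
    # HUMUS-Net itself uses a hybrid multi-scale architecture; this is not it.
    def __init__(self, n_iters=5):
        super().__init__()
        self.n_iters = n_iters
        self.step = nn.Parameter(torch.tensor(0.5))  # learned step size
        self.denoiser = nn.Sequential(               # stand-in for the learned block
            nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(),
            nn.Conv2d(16, 1, 3, padding=1),
        )

    def forward(self, x0, y, A, At):
        # x0: initial image estimate; y: measurements;
        # A / At: forward operator and its adjoint (problem-specific).
        x = x0
        for _ in range(self.n_iters):
            x = x - self.step * At(A(x) - y)  # gradient step toward data consistency
            x = x + self.denoiser(x)          # learned residual refinement
        return x

# Shape check with identity "measurements" (purely illustrative):
net = UnrolledRecon()
y = torch.randn(1, 1, 32, 32)
x_hat = net(torch.zeros_like(y), y, A=lambda v: v, At=lambda v: v)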

Papers
Small random initialization is akin to spectral learning: Optimization and generalization guarantees for overparameterized low-rank matrix reconstruction. D. Stöger and M. Soltanolkotabi. NeurIPS 2021. https://arxiv.org/abs/2106.15013

Neural networks can learn representations with gradient descent. A. Damian, J. D. Lee, and M. Soltanolkotabi. COLT 2022. https://arxiv.org/pdf/2206.15144.pdf

HUMUS-Net: Hybrid unrolled multi-scale network architecture for accelerated MRI reconstruction. Z. Fabian, B. Tinaz, and M. Soltanolkotabi. NeurIPS 2022. https://arxiv.org/abs/2203.08213

Data augmentation for deep learning based accelerated MRI reconstruction with limited data. Z. Fabian, R. Heckel, and M. Soltanolkotabi. ICML 2021. https://arxiv.org/abs/2106.14947

Biography
Mahdi Soltanolkotabi is the director of the Center for AI Foundations for the Sciences (AIF4S) at USC. He is also an associate professor in the departments of Electrical and Computer Engineering, Computer Science, and Industrial and Systems Engineering, where he holds an Andrew and Erna Viterbi Early Career Chair. Prior to joining USC, he completed his PhD in electrical engineering at Stanford in 2014 and was a postdoctoral researcher in the EECS department at UC Berkeley during the 2014-2015 academic year. His research focuses on developing the mathematical foundations of modern data science by characterizing the behavior and pitfalls of contemporary nonconvex learning and optimization algorithms, with applications in deep learning, large-scale distributed training, federated learning, computational imaging, and AI for scientific applications. Mahdi is the recipient of the Information Theory Society Best Paper Award, a Packard Fellowship in Science and Engineering, a Sloan Research Fellowship in mathematics, an NSF CAREER award, an Air Force Office of Scientific Research Young Investigator award (AFOSR YIP), the Viterbi School of Engineering Junior Faculty Research Award, and faculty research awards from Google and Amazon.
