LES HOUCHES SCHOOL OF PHYSICS SINCE 1951
Summer School on the Statistical Physics of Machine Learning
Organisers: Florent Krzakala and Lenka Zdeborova, EPFL
Overview:
The school is aimed primarily at the growing audience of theoretical physicists, applied mathematicians, computer scientists, and colleagues from other computational fields interested in machine learning, neural networks, and high-dimensional data analysis. We shall cover the basics and frontiers of high-dimensional statistics, machine learning, the theory of computing and statistical learning, and the related mathematics and probability theory. We will place a special focus on the methods of statistical physics and their results in the context of current questions and theories related to these problems. Open questions and research directions will be discussed as well.
Teachers:
- Francis Bach (Inria, ENS): Sums-of-squares: from polynomials to kernels [lecture notes] [videos]
- Yasaman Bahri (Google) and Boris Hanin (Princeton): Deep Learning at Large and Infinite Width [lecture notes] [videos]
- Boaz Barak (Harvard): Computational Complexity of Deep Learning: Fundamental Limitations and Empirical Phenomena [videos]
- Giulio Biroli (ENS Paris): High-Dimensional Non-Convex Landscapes and Gradient Descent Dynamics [lecture notes] [videos]
- Michael I. Jordan (Berkeley): On decisions, dynamics, incentives, and mechanism design [lecture notes] [videos]
- Julia Kempe (NYU): Data, Physics and Kernels, and how (statistical) physics tools can help the DL practitioner. Slides: [1] [lecture notes] [videos]
- Yann LeCun (Facebook & NYU): From machine learning to autonomous intelligence. Slides: [1], [2], [3] [lecture notes] [videos]
- Marc Mézard (ENS Paris): Belief Propagation, Message Passing & Sparse Models [lecture notes]
- Rémi Monasson (ENS Paris): Replica method for computational problems with randomness: principles and illustrations. Slides: [1], [2] [videos]
- Andrea Montanari (Stanford): Neural networks from a nonparametric viewpoint [lecture notes] [videos]
- Sara Solla (Northwestern Univ.): Statistical Physics, Bayesian Inference and Neural Information Processing. Slides: [1], [2], [3] [lecture notes] [videos]
- Haim Sompolinsky (Harvard & Hebrew Univ.): Statistical Mechanics of Machine Learning [videos]
- Nathan Srebro (TTI Chicago): Applying statistical learning theory to deep learning. Slides: [1], [2], [3], [4], [5], [6] [lecture notes] [videos]
- Eric Vanden-Eijnden (NYU Courant): Benefits of overparametrization in statistical learning & Enhancing MCMC sampling with learning [lecture notes] [videos]
- Matthieu Wyart (EPFL): Loss landscape, over-parametrization and curse of dimensionality in deep learning. Slides: [1] [videos]
Topics:
- Phase transitions in machine learning
- Computational complexity
- Dynamics of learning in high dimensions
- Message passing algorithms
- Challenges in machine learning
- Statistical physics of deep neural networks
- High-dimensional statistics
- Optimization and implicit regularisation
- Replica and cavity methods
- Probability theory and rigorous approaches
- Statistical inference in high dimensions
- Computational learning theory
Participants:
Find here the list of participants in the school.
Click here for the poster of the school.