Apr 9–11, 2025
Perimeter Institute for Theoretical Physics
America/Toronto timezone

Scaling Limits for Learning: Dynamics and Statics

Apr 9, 2025, 9:45 a.m.
45m
PI/4-400 - Space Room (Perimeter Institute for Theoretical Physics)

Workshop Talk

Speaker

Blake Bordelon (Harvard University)

Description

In this talk, I will discuss how physics can help improve our understanding of deep learning systems and guide improvements to their scaling strategies. I will first discuss mathematical results, based on mean-field techniques from statistical physics, that analyze the feature-learning dynamics of neural networks as well as the posteriors of large Bayesian neural networks. This theory will provide insight for developing initialization and optimization schemes for neural networks that admit well-defined infinite-width and infinite-depth limits and behave consistently across model scales, offering practical advantages. These limits also enable a theoretical characterization of the types of learned solutions reached by deep networks, and they provide a starting point for characterizing generalization and neural scaling laws (see Cengiz Pehlevan's talk).
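The "well-defined infinite-width limit" mentioned above refers to parameterizations in which network outputs and feature-learning updates remain controlled as the width grows. Below is a minimal, illustrative sketch of that idea (not code from the talk; all names and choices here are assumptions): a two-layer network with a mean-field 1/width readout, whose output concentrates as width increases instead of blowing up or fluctuating wildly.

```python
# Minimal sketch of a mean-field parameterization for a two-layer net.
# Hidden and readout weights are initialized at O(1) scale; the 1/width
# prefactor on the readout (rather than 1/sqrt(width)) is what yields a
# well-defined feature-learning limit as width grows. Illustrative only.
import numpy as np

def init_mean_field_mlp(d_in, width, rng):
    """O(1)-scale weights; the 1/width readout factor is applied in forward()."""
    W = rng.standard_normal((width, d_in))  # hidden weights, O(1) entries
    a = rng.standard_normal(width)          # readout weights, O(1) entries
    return W, a

def forward(x, W, a, width):
    h = np.tanh(W @ x / np.sqrt(len(x)))    # O(1) preactivations
    return a @ h / width                    # mean-field 1/width readout

rng = np.random.default_rng(0)
x = rng.standard_normal(16)
for width in (128, 1024, 8192):
    W, a = init_mean_field_mlp(16, width, rng)
    # At initialization the output shrinks toward 0 as width grows,
    # a hallmark of the mean-field scaling (vs. O(1) in lazy/NTK scaling).
    print(width, forward(x, W, a, width))
```

In a full training setup, learning rates would also be rescaled with width so that the per-step change in features stays O(1) across model sizes; that width-consistency is what the abstract means by schemes that "behave consistently across model scales."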

Presentation materials

There are no materials yet.
