Search Results for author: Qianyi Li

Found 4 papers, 1 paper with code

Connecting NTK and NNGP: A Unified Theoretical Framework for Neural Network Learning Dynamics in the Kernel Regime

no code implementations • 8 Sep 2023 • Yehonatan Avidan, Qianyi Li, Haim Sompolinsky

In this regime, two disparate theoretical frameworks have been used, in which the network's output is described using kernels: one framework is based on the Neural Tangent Kernel (NTK), which assumes linearized gradient descent dynamics, while the Neural Network Gaussian Process (NNGP) kernel assumes a Bayesian framework.
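The NTK mentioned in the abstract has a concrete finite-width (empirical) form: the inner product of parameter gradients of the network output at two inputs. Below is a minimal illustrative sketch of that quantity for a tiny one-hidden-layer tanh network; the architecture, sizes, and initialization scales are assumptions for illustration only, not taken from the paper.

```python
import numpy as np

# Empirical (finite-width) NTK sketch for f(x) = w2 . tanh(W1 x).
# NTK(x, x') = <df/dtheta(x), df/dtheta(x')>, summed over all parameters.
rng = np.random.default_rng(0)
d, h = 3, 16                                 # input dim, hidden width (assumed)
W1 = rng.normal(size=(h, d)) / np.sqrt(d)    # standard 1/sqrt(fan-in) scaling
w2 = rng.normal(size=h) / np.sqrt(h)

def grads(x):
    """Gradient of the scalar output w.r.t. all parameters, flattened."""
    act = np.tanh(W1 @ x)                    # hidden activations
    dw2 = act                                # df/dw2
    dW1 = np.outer(w2 * (1 - act**2), x)     # df/dW1 via the chain rule
    return np.concatenate([dW1.ravel(), dw2])

def ntk(x1, x2):
    return grads(x1) @ grads(x2)

x, xp = rng.normal(size=d), rng.normal(size=d)
print(ntk(x, xp))                            # one empirical NTK entry
```

At infinite width this kernel concentrates and stays fixed under linearized gradient descent, which is the regime the NTK framework describes; the NNGP kernel instead arises from averaging over the Bayesian posterior of the weights.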

Globally Gated Deep Linear Networks

no code implementations • 31 Oct 2022 • Qianyi Li, Haim Sompolinsky

The rich and diverse behavior of the GGDLNs suggests that they are useful, analytically tractable models for learning single and multiple tasks in finite-width nonlinear deep networks.

L2 Regularization
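One way to picture a gated deep linear network is that the trained weights act purely linearly, while all nonlinearity enters through gating variables that depend on the input but not on the trained weights. The sketch below follows that reading; the gating mechanism (fixed random 0/1 gates), sizes, and depth are illustrative assumptions, not the paper's exact definition.

```python
import numpy as np

# Sketch of a gated deep linear network: h_{l+1} = g_l(x) * (W_l h_l),
# where the gates g_l(x) come from fixed (untrained) projections of the
# input, so the output stays linear in each layer's trained weights.
rng = np.random.default_rng(0)
d, h, L = 4, 8, 3                                  # input dim, width, depth (assumed)
Ws = [rng.normal(size=(h, d)) / np.sqrt(d)] + \
     [rng.normal(size=(h, h)) / np.sqrt(h) for _ in range(L - 1)]
a = rng.normal(size=h) / np.sqrt(h)                # linear readout
Gs = [rng.normal(size=(h, d)) for _ in range(L)]   # fixed gating projections

def forward(x):
    gates = [(G @ x > 0).astype(float) for G in Gs]  # input-dependent 0/1 gates
    hvec = x
    for W, g in zip(Ws, gates):
        hvec = g * (W @ hvec)                      # gating multiplies the linear map
    return a @ hvec

print(forward(rng.normal(size=d)))
```

Because the gates never depend on the trained weights, doubling any single layer's weight matrix doubles the output, which is the linearity that makes such models analytically tractable.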

Statistical Mechanics of Deep Linear Neural Networks: The Back-Propagating Kernel Renormalization

no code implementations • 7 Dec 2020 • Qianyi Li, Haim Sompolinsky

This procedure allows us to evaluate important network properties, such as its generalization error, the role of network width and depth, the impact of the size of the training set, and the effects of weight regularization and learning stochasticity.
