Colloquium / Seminars
Topic: A Swiss-Army Knife for the Theory of Deep Learning and Beyond
Speaker: Mr. Greg Yang (Microsoft Research)
Date & Time: Dec. 25, 2019, 14:00
Venue: SA223
Abstract:
The resurgence of neural networks has revolutionized artificial intelligence since 2010. Luckily for mathematicians and statistical physicists, the study of large random network scaling limits, which can be thought of as *nonlinear* random matrix theory, is both practically important and mathematically interesting. We describe several problems in this setting and develop a new comprehensive framework, called "tensor programs", for solving them. This framework can be thought of as an automatic tool for deriving the behavior of computation graphs with large random matrices, such as those arising in neural network computation. It is very general, and from it we also obtain new proofs of the semicircle and Marchenko-Pastur laws. Many insights on neural networks follow from this framework, such as the convergence of wide neural networks to Gaussian processes. This talk presents the works arXiv:1902.04760 and arXiv:1910.12478.
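As a quick numerical illustration of the Gaussian-process limit mentioned in the abstract (this sketch is not from the talk; the one-hidden-layer architecture, tanh nonlinearity, and 1/sqrt(width) output scaling are illustrative assumptions), the following Python snippet samples the output of a randomly initialized network and checks that its distribution becomes increasingly Gaussian as the width grows:

```python
import numpy as np

rng = np.random.default_rng(0)

def random_net_output(x, width, n_samples=2000):
    """Sample f(x) = v . tanh(W x) / sqrt(width) over random weights W, v."""
    d = x.shape[0]
    W = rng.normal(size=(n_samples, width, d)) / np.sqrt(d)  # input weights
    v = rng.normal(size=(n_samples, width))                  # output weights
    h = np.tanh(W @ x)                                       # hidden activations
    return (v * h).sum(axis=1) / np.sqrt(width)              # 1/sqrt(n) scaling

x = np.ones(3)
for width in (1, 10, 1000):
    out = random_net_output(x, width)
    # Excess kurtosis tends to 0 as the output distribution approaches a Gaussian.
    k = np.mean((out - out.mean())**4) / out.var()**2 - 3.0
    print(f"width={width:4d}  var={out.var():.3f}  excess kurtosis={k:+.3f}")
```

At width 1 the excess kurtosis should be noticeably positive, while at width 1000 it should be close to zero, consistent with the convergence of wide networks to Gaussian processes established in the referenced papers.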