Dec 2 – 4, 2024
SISSA
Europe/Rome timezone

Sebastian Goldt -- Elements of a modern theory for deep neural networks

Dec 3, 2024, 12:40 PM
55m
Room 128-129 (1st floor) (SISSA)

Description

Deep neural networks are complex functions composed of a large number of simple units called neurons. Their remarkable success in machine learning, where they excel at high-dimensional problems when given a large number of parameters, has challenged classical theories of learning. Understanding how learning emerges from the interaction of millions of neurons presents a deep theoretical challenge for mathematicians, physicists and statisticians. In this talk, I will give a brief introduction to what neural networks are and how they learn. I will then argue that a modern theory of deep learning will have to explain how learning emerges from the interplay of network architecture, the learning algorithm, and the structure of the training data.
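As an illustrative sketch (not taken from the talk itself), the two ideas in the abstract — a network as a composition of simple neurons, and learning via an algorithm such as gradient descent — can be made concrete in a few lines of Python. The example below is a hypothetical toy: a two-layer network trained on XOR, a function no single neuron can represent, using plain NumPy and hand-written gradients.

```python
import numpy as np

# Toy example: a two-layer network of sigmoid neurons learns XOR
# by gradient descent on the mean squared error.
rng = np.random.default_rng(0)

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)  # inputs
y = np.array([[0], [1], [1], [0]], dtype=float)              # XOR targets

# Parameters: a hidden layer of 4 neurons and one output neuron.
W1 = rng.normal(0, 1, (2, 4))
b1 = np.zeros(4)
W2 = rng.normal(0, 1, (4, 1))
b2 = np.zeros(1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr = 1.0  # learning rate (chosen for this toy problem)
losses = []
for step in range(2000):
    # Forward pass: each neuron computes a weighted sum of its
    # inputs followed by a simple nonlinearity.
    h = sigmoid(X @ W1 + b1)          # hidden activations
    out = sigmoid(h @ W2 + b2)        # network output
    loss = np.mean((out - y) ** 2)    # mean squared error
    losses.append(loss)

    # Backward pass: gradients of the loss w.r.t. every parameter,
    # obtained by the chain rule (backpropagation by hand).
    d_out = 2 * (out - y) / len(X) * out * (1 - out)
    dW2 = h.T @ d_out
    db2 = d_out.sum(axis=0)
    d_h = (d_out @ W2.T) * h * (1 - h)
    dW1 = X.T @ d_h
    db1 = d_h.sum(axis=0)

    # Gradient-descent update: nudge every parameter downhill.
    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2

print(f"loss: {losses[0]:.3f} -> {losses[-1]:.3f}")
```

Even this minimal setting exhibits the interplay the abstract points to: the architecture (depth, hidden width), the algorithm (gradient descent and its learning rate), and the structure of the data (XOR is not linearly separable) jointly determine whether and how learning succeeds.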
