
PhDrink

Where: Abscint

Wednesday 16 April 2025 from 16:00 until 18:00

Participants: 17

Free


Organized by: MaRCo

Come hear about the research carried out by AM PhD students at the UT.


Alexander Wierzba will talk about infinite dimensions, unbounded operators and what this all has to do with music, while Insung Kong will discuss the minimax optimality of deep neural networks on regression problems.


Title: Infinite dimensions, unbounded operators and what this all has to do with music

Abstract: When mathematically modeling complex physical systems, one is bound to encounter partial differential equations (PDEs). These can be used to describe phenomena such as vibrations, heat conduction or fluid flows. The language and tools to analyze such equations from a system-theoretic viewpoint are provided by functional analysis and operator theory.

This talk will give a basic introduction to the realm of infinite-dimensional vector spaces and the concept of unbounded operators. We will see how such operators arise naturally from PDEs, how they differ from their finite-dimensional counterparts, and even how they can be used to understand how instruments make music.
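
As a standard textbook illustration (a sketch of the classical vibrating-string model, not necessarily the example used in the talk), the wave equation
\[
u_{tt}(x,t) = c^2\, u_{xx}(x,t), \qquad u(0,t) = u(L,t) = 0,
\]
describes a string of length $L$ fixed at both ends. The associated operator $A = c^2\, \frac{d^2}{dx^2}$ with domain $D(A) = H^2(0,L) \cap H_0^1(0,L)$ is unbounded on $L^2(0,L)$, and its eigenvalues
\[
\lambda_n = -\left(\frac{n\pi c}{L}\right)^2, \qquad n = 1, 2, \ldots,
\]
correspond to vibration frequencies $f_n = \frac{nc}{2L}$: the fundamental pitch of the string and its overtones.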


Title: Minimax optimality of deep neural networks on regression problems

Abstract: As deep neural networks have become widely used in machine learning, there have been many attempts to understand why they perform well. In this talk, we aim to explore the research that has been carried out from the perspective of statistical learning theory.

We first introduce the concept of 'minimax optimality', a key notion in statistical learning theory. Then we explore results on the minimax optimality of deep neural networks for regression problems. In all three settings considered (the smooth function assumption, the manifold assumption and the hierarchical structure assumption), deep neural networks achieve minimax optimality.
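
As background (a standard definition from statistical learning theory, not taken from the talk itself): given $n$ noisy samples of a regression function $f$ from a class $\mathcal{F}$, the minimax risk is
\[
\inf_{\hat f}\, \sup_{f \in \mathcal{F}}\, \mathbb{E}\, \bigl\| \hat f - f \bigr\|_2^2,
\]
where the infimum ranges over all estimators $\hat f$ built from the samples; an estimator is minimax optimal if its worst-case risk matches this benchmark up to constants or logarithmic factors. For instance, for $\beta$-smooth functions on $[0,1]^d$ the classical minimax rate is $n^{-2\beta/(2\beta+d)}$, and suitably structured deep ReLU networks are known to attain it up to log factors.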