Event Type
Seminar
Wednesday, September 25, 2019 4:30 PM
Yiping Lu, Stanford, ICME

Deep learning achieves great success in many areas, largely due to overparameterization. In this talk, we aim to understand overparameterization by taking the number of neurons to infinity, and we consider two such limits. The first is the depth limit: as the depth of the network goes to infinity, the neural network can be understood as an ODE, and the learning process can be understood as solving an optimal control problem. The ODE view gives us new ways to build models, and the control perspective enables us to design optimization algorithms that exploit the structure of the neural network. The second is the infinite-width limit, where recent work has shown that the network is equivalent to a kernel method, now known as the Neural Tangent Kernel. From this perspective, we explore the neural network's role in distillation and noisy-label refinement.
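As background, a minimal sketch of the two limits mentioned above (the notation x, f, \theta, and K is ours, not taken from the abstract). In the depth limit, a residual block is a forward-Euler step of an ODE:

\[ x_{t+1} = x_t + f(x_t, \theta_t) \quad\longrightarrow\quad \frac{dx}{dt} = f(x(t), \theta(t)), \]

so training the weights \theta(t) reads as an optimal control problem for the trajectory's terminal state. In the width limit, training an infinitely wide network f(x; \theta) by gradient descent is governed by the fixed kernel

\[ K(x, x') = \big\langle \nabla_\theta f(x; \theta),\, \nabla_\theta f(x'; \theta) \big\rangle, \]

the Neural Tangent Kernel.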