Seminar
Wednesday, February 23, 2022, 4:30 PM
Florian Schaefer (Georgia Tech)

In this talk, we develop algorithms for numerical computation based on ideas from competitive games and statistical inference.

In the first part, we propose competitive gradient descent (CGD) as a natural generalization of gradient descent to saddle-point problems and general-sum games. Whereas gradient descent minimizes a local linear approximation of the objective at each step, CGD steps to the Nash equilibrium of a local bilinear approximation of the game. Explicitly accounting for the interaction between the agents significantly improves the convergence properties, as demonstrated in applications to GANs, reinforcement learning, computer graphics, and physics-informed neural networks.
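
As a concrete, purely illustrative sketch in the spirit of this idea (not code from the talk), the NumPy snippet below implements one CGD step for the zero-sum case min_x max_y f(x, y): each player moves to the Nash equilibrium of a local bilinear model of the game, which couples the two gradients through the mixed second-derivative block. The toy bilinear game, step size, and iteration count are our assumptions chosen for illustration.

    import numpy as np

    def cgd_step(x, y, grad_x, grad_y, D_xy, eta):
        # One competitive gradient descent step for min_x max_y f(x, y).
        # Each player plays the Nash equilibrium of a local bilinear
        # approximation of the game; the mixed Hessian block D_xy couples
        # the two gradients, unlike plain simultaneous gradient descent.
        n_x, n_y = D_xy.shape
        dx = -eta * np.linalg.solve(
            np.eye(n_x) + eta**2 * (D_xy @ D_xy.T),
            grad_x + eta * D_xy @ grad_y)
        dy = eta * np.linalg.solve(
            np.eye(n_y) + eta**2 * (D_xy.T @ D_xy),
            grad_y - eta * D_xy.T @ grad_x)
        return x + dx, y + dy

    # Toy bilinear game f(x, y) = x^T A y, whose Nash equilibrium is the
    # origin; simultaneous gradient descent/ascent spirals outward here,
    # while the coupled CGD update contracts toward the equilibrium.
    A = np.array([[3.0, 1.0], [1.0, 2.0]])
    x, y = np.array([1.0, -1.0]), np.array([0.5, 2.0])
    for _ in range(200):
        x, y = cgd_step(x, y, A @ y, A.T @ x, A, eta=0.2)
    print(np.linalg.norm(x), np.linalg.norm(y))  # both close to zero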

In the second part, we show that the conditional near-independence properties of smooth Gaussian processes imply that the Cholesky factors of their dense covariance matrices are approximately sparse. We use this insight to derive simple, fast solvers with state-of-the-art complexity-versus-accuracy guarantees for general elliptic differential and integral equations. Our methods come with rigorous error estimates, are easy to parallelize, and perform well in practice.
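
To make the sparsity phenomenon concrete, here is a small, purely illustrative NumPy/SciPy experiment (again ours, not code from the talk): we order random points from coarse to fine with a greedy maximin ordering, assemble the dense covariance matrix of an exponential (Matérn-1/2) kernel in that ordering, and count how many entries of its Cholesky factor are negligible. The point set, kernel, length scale, and cutoff are all assumptions chosen for illustration.

    import numpy as np
    from scipy.spatial.distance import cdist

    rng = np.random.default_rng(0)
    pts = rng.random((500, 2))  # random measurement sites in the unit square

    # Greedy maximin ("coarse to fine") ordering: start from one point and
    # repeatedly add the point farthest from everything chosen so far.
    order = [0]
    dist_to_chosen = cdist(pts, pts[:1]).ravel()
    for _ in range(len(pts) - 1):
        nxt = int(np.argmax(dist_to_chosen))
        order.append(nxt)
        dist_to_chosen = np.minimum(
            dist_to_chosen, cdist(pts, pts[nxt:nxt + 1]).ravel())

    # Dense covariance matrix of an exponential (Matern-1/2) kernel in the
    # maximin ordering; the tiny jitter guards against round-off during
    # the factorization.
    P = pts[np.array(order)]
    K = np.exp(-cdist(P, P) / 0.1) + 1e-10 * np.eye(len(P))
    L = np.linalg.cholesky(K)

    # Screening effect: conditional on the coarser points, each fine point
    # interacts only with its near neighbors, so most entries of L are tiny.
    lower = np.abs(L[np.tril_indices_from(L)])
    print(f"{(lower < 1e-3 * lower.max()).mean():.1%} of lower-triangular "
          f"entries fall below 1e-3 times the largest entry")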