Event Type: Seminar
Monday, April 1, 2024, 4:00 PM
Benjamin McKenna (Harvard)

In recent years, machine learning has motivated the study of what one might call "nonlinear random matrices." This broad term covers various random matrices whose construction involves the entrywise application of a deterministic nonlinear function, such as ReLU. We study one such model, obtained by applying a nonlinearity f entrywise to a sample covariance matrix XᵀX; the resulting matrix f(XᵀX) is typically called a "random inner-product kernel matrix" in the literature. A priori, entrywise modifications can affect the eigenvalues of a matrix in complicated ways. However, a long line of work has established that the global spectrum of such matrices actually behaves in a simple way, described by free probability, when either the randomness in X comes from a restricted family (such as Gaussian) or the side lengths of X are proportional. We show that this description is universal, holding both for much more general randomness in X and for nonclassical aspect ratios.

This is joint work with Sofiia Dubova, Yue M. Lu, and Horng-Tzer Yau.
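As a rough numerical illustration of the model (not part of the talk), the sketch below builds a random inner-product kernel matrix and computes its eigenvalues. The specific conventions here are assumptions: Gaussian data, a centered ReLU as f, a removed diagonal, and 1/√d and 1/√n scalings chosen only so the spectrum stays of order one; they need not match the paper's exact normalization.

```python
import numpy as np

# Minimal sketch: K_ij = f(<x_i, x_j>/sqrt(d)) / sqrt(n) for i != j,
# with the diagonal set to zero. All normalizations are illustrative.
rng = np.random.default_rng(0)

d, n = 400, 1200                 # proportional regime: n/d = 3
X = rng.standard_normal((d, n))  # iid Gaussian columns x_1, ..., x_n

G = X.T @ X / np.sqrt(d)         # rescaled inner-product (Gram) matrix
f = lambda t: np.maximum(t, 0) - 1 / np.sqrt(2 * np.pi)  # centered ReLU
K = f(G) / np.sqrt(n)            # entrywise nonlinearity, global scaling
np.fill_diagonal(K, 0.0)         # keep only the off-diagonal kernel part

eigs = np.linalg.eigvalsh(K)     # empirical global spectrum, to be compared
print("eigenvalue range:", eigs.min(), eigs.max())  # with the free-probability prediction
```

A histogram of `eigs` (for example via `np.histogram` or a plotting library) gives the empirical spectral distribution whose large-dimension limit is the object described by free probability in the abstract.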