Manifold optimization has found wide applications across various scientific and engineering domains. In this talk, I will present our recently developed algorithms for large-scale decentralized and federated manifold optimization. In addition, I will present a retraction-free and penalty parameter-free gradient method for solving optimization problems over the Stiefel manifold and demonstrate its potential for fine-tuning large language models.
Bio: Jiang Hu is currently a Postdoctoral Scholar at the University of California, Berkeley. He received his Ph.D. in numerical optimization from Peking University and subsequently worked as a postdoctoral fellow at the Chinese University of Hong Kong and at Harvard Medical School. His research focuses on mathematical optimization, particularly manifold optimization, nonsmooth optimization, and stochastic/decentralized/federated optimization, with applications in computational sciences and machine learning. He received the Best Paper Award at the 2024 IEEE International Conference on Acoustics, Speech, and Signal Processing.