Wednesday, February 8, 2023 12:00 PM
Jin-Peng Liu (Berkeley)

Nonlinear dynamics play a prominent role in many domains and are notoriously difficult to solve. Whereas previous quantum algorithms for general nonlinear equations have been severely limited by the linearity of quantum mechanics, we give the first efficient quantum algorithm for nonlinear differential equations with sufficiently strong dissipation. This is an exponential improvement over the best previous quantum algorithms, whose complexity is exponential in the evolution time. We also establish a lower bound showing that nonlinear differential equations with sufficiently weak dissipation have worst-case complexity exponential in time, giving an almost tight classification of the quantum complexity of simulating nonlinear dynamics. Furthermore, we design end-to-end quantum machine learning algorithms, combining efficient quantum (stochastic) gradient descent with sparse state preparation and sparse state tomography. We benchmark training of sparse ResNet instances with up to 103 million parameters, and identify that the dissipative and sparse regime in the early phase of fine-tuning could admit a quantum enhancement. Our work shows that fault-tolerant quantum algorithms could potentially contribute to the scalability and sustainability of most state-of-the-art, large-scale machine learning models.
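The dissipation condition above can be illustrated with a classical toy model (this is only an illustration of the regime, not the quantum algorithm of [1]): a one-dimensional quadratic ODE du/dt = -a*u + b*u^2, where a is the dissipation strength and b the nonlinearity. When dissipation dominates (roughly a > b*|u0|), the solution decays toward zero; when it is too weak, the nonlinearity wins and the exact solution blows up in finite time, which mirrors the easy/hard classification described in the abstract. The parameter names a, b, and u0 are chosen here for illustration.

```python
def integrate(u0, a, b, dt=1e-3, steps=10_000):
    """Forward-Euler integration of du/dt = -a*u + b*u**2."""
    u = u0
    for _ in range(steps):
        u += dt * (-a * u + b * u * u)
    return u

# Strongly dissipative regime (a > b*|u0|): the solution decays to zero.
decayed = integrate(0.5, a=2.0, b=1.0)

# Weakly dissipative regime (a < b*|u0|): the nonlinear term dominates
# and the solution grows (the exact dynamics blow up in finite time,
# so we stop the integration early, at t = 2).
grown = integrate(0.5, a=0.1, b=1.0, steps=2_000)
```

In the quantum setting, the analogous quantity is the ratio of nonlinearity to dissipation: the algorithm of [1] is efficient when that ratio is small, and the lower bound shows the weakly dissipative regime is hard in the worst case.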



[1] Liu et al. Efficient quantum algorithm for dissipative nonlinear differential equations, Proceedings of the National Academy of Sciences 118, 35 (2021), arXiv:2011.03185.

[2] Liu et al. Towards provably efficient quantum algorithms for large-scale machine learning models, in preparation.