Abstract: We consider two problems for a symmetric matrix A: (1) testing whether A is positive semidefinite or has a sufficiently negative minimum eigenvalue. We give optimal algorithms in both the matrix-vector and vector-matrix-vector product models. One of our algorithms implements a new random walk and uses only a single vector-matrix-vector product per iteration, rather than the full matrix-vector product required in each iteration of classical subspace iteration algorithms.
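To make the two query models concrete, here is a small Python sketch. It is not the talk's random-walk algorithm; as a hedged baseline it runs power iteration on c*I - A (assuming a known upper bound c on the spectrum) using matrix-vector queries, and uses a single vector-matrix-vector query for the final Rayleigh-quotient check.

```python
import numpy as np

rng = np.random.default_rng(0)

# A symmetric test matrix with one slightly negative eigenvalue.
n = 50
Q, _ = np.linalg.qr(rng.standard_normal((n, n)))
eigs = np.linspace(0.1, 1.0, n)
eigs[0] = -0.3  # minimum eigenvalue is -0.3, so A is not PSD
A = Q @ np.diag(eigs) @ Q.T

def matvec(v):
    """Matrix-vector query model: the algorithm only sees A @ v."""
    return A @ v

def vec_mat_vec(u, v):
    """Vector-matrix-vector query model: the algorithm only sees u^T A v."""
    return u @ (A @ v)

# Naive baseline (NOT the talk's algorithm): power iteration on c*I - A,
# whose top eigenvector is the eigenvector of A's most negative eigenvalue.
# Here c = 1.0 is an assumed upper bound on A's spectrum.
c = 1.0
x = rng.standard_normal(n)
for _ in range(200):
    x = c * x - matvec(x)   # one matrix-vector query per iteration
    x /= np.linalg.norm(x)

min_eig_estimate = vec_mat_vec(x, x)  # one bilinear query for the final check
print(min_eig_estimate)  # close to -0.3, certifying that A is not PSD
```

The point of the talk's algorithm is that it avoids the full matrix-vector query in the loop above, paying only a single scalar u^T A v per iteration.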
(2) obtaining additive-error estimates of all the eigenvalues of A. Here we give an optimally sized sketch, which is simply GAG^T for a random Gaussian matrix G. The eigenvalues of GAG^T are not themselves good approximations to the eigenvalues of A; nevertheless, we show how to recover estimates of the eigenvalues of A from the sketch.
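A minimal illustration of the sketch itself, under assumed dimensions (n = 200, sketch size k = 20): the code below forms GAG^T and shows numerically why its raw eigenvalues are not estimates of A's eigenvalues, which is what makes the recovery procedure in the talk necessary. The recovery step itself is not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(1)

# A symmetric matrix with known spectrum in [-1, 1].
n, k = 200, 20
A = np.diag(np.linspace(-1.0, 1.0, n))

G = rng.standard_normal((k, n))   # random Gaussian sketching matrix
sketch = G @ A @ G.T              # the k x k sketch G A G^T

print(sketch.shape)  # (20, 20): tiny compared to the n x n matrix A

# Raw eigenvalues of the sketch: writing GAG^T = sum_j lambda_j z_j z_j^T
# with z_j = G v_j i.i.d. Gaussian, its eigenvalue scale grows with n,
# so the sketch's spectrum lands far outside A's range [-1, 1].
print(np.linalg.eigvalsh(A).max())       # 1.0
print(np.linalg.eigvalsh(sketch).max())  # much larger than 1.0
```

Note also that the sketch has only k eigenvalues while A has n, so some nontrivial decoding is unavoidable if one wants estimates of all of A's eigenvalues.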
This is based on joint work with Deanna Needell and William Swartworth (FOCS 2022) and with William Swartworth (STOC 2023).
Bio: David Woodruff is a professor at Carnegie Mellon University in the Computer Science Department. Before that he was a research scientist at the IBM Almaden Research Center, which he joined in 2007 after completing his Ph.D. at MIT in theoretical computer science. His research interests include data stream algorithms, distributed algorithms, machine learning, numerical linear algebra, optimization, sketching, and sparse recovery. He is the recipient of the 2020 Simons Investigator Award, the 2014 Presburger Award, and Best Paper Awards at STOC 2013, PODS 2010, and PODS 2020. At IBM he was a member of the Academy of Technology and a Master Inventor.