In this thesis we study efficient algorithms for solving very large linear algebra problems. We first consider the Kaczmarz method for solving linear systems, and develop a variant that is robust to a
small number of large corruptions, while still requiring only a small working memory. We provide
both theoretical guarantees for certain data distributions and empirical results showing that
our approach works well in practice. We then turn our attention to problems of quickly learning
spectral information about a matrix. The first such problem is PSD-testing, where we give optimal
query complexity bounds (with respect to various types of queries) for distinguishing between a
matrix being positive semi-definite versus having a large negative eigenvalue. Building on part of
this work, we then develop optimal sketches for learning the entire spectrum of a matrix to within
additive error. Finally, we return our attention to solving linear systems and give new algorithms
that achieve optimal communication complexity for solving least-squares regression problems.
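For concreteness, the classical randomized Kaczmarz iteration referenced above can be sketched as follows. This is a minimal illustration of the standard method (not the robust variant developed in the thesis), with rows sampled in proportion to their squared norms; all function and variable names are illustrative.

```python
import numpy as np

def kaczmarz(A, b, iters=2000, seed=0):
    """Randomized Kaczmarz: at each step, project the current iterate
    onto the hyperplane defined by one sampled row of the system Ax = b."""
    rng = np.random.default_rng(seed)
    m, n = A.shape
    x = np.zeros(n)
    row_norms = (A * A).sum(axis=1)
    probs = row_norms / row_norms.sum()  # sample rows prop. to squared norm
    for _ in range(iters):
        i = rng.choice(m, p=probs)
        # orthogonal projection of x onto {y : <A[i], y> = b[i]}
        x += (b[i] - A[i] @ x) / row_norms[i] * A[i]
    return x

# Demo on a small consistent overdetermined system.
A = np.array([[2.0, 1.0], [1.0, 3.0], [1.0, -1.0]])
x_true = np.array([1.0, 2.0])
b = A @ x_true
x = kaczmarz(A, b)
```

Note that the method touches only one row of `A` per iteration, which is what makes it attractive in the small-working-memory regime the thesis considers.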