This dissertation focuses on the development, implementation, and analysis of fast algorithms for the Minimum Sobolev norm (MSN) method. The MSN method obtains a unique solution from an underdetermined linear system by minimizing a derivative norm in the appropriate Hilbert space. We obtain fast algorithms by exploiting the inherent structure of the underlying system: after an Inverse Discrete Cosine Transform, only a small number of additional operations is required. Results show that the method performs as well as Chebyshev interpolation when approximating smooth functions and better than a wide variety of smooth Chebyshev filters when approximating rough functions.
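The minimum-norm viewpoint can be summarized in a few lines of linear algebra. The sketch below is a minimal Python illustration, assuming a Chebyshev basis \(T_k\) and a diagonal Sobolev-type weight \((1+k)^s\); it uses a dense solver for clarity rather than the fast structured algorithm developed in the dissertation, and the function name and parameters are illustrative only.

\begin{verbatim}
# Minimal sketch of an MSN-style interpolant: minimize ||D a||_2 subject to
# V a = f, with V[j, k] = T_k(x[j]) (Chebyshev) and D = diag((1 + k)^s).
# Dense linear algebra only; not the fast IDCT-based algorithm.
import numpy as np

def msn_interpolate(x, f, n_coeffs, s=2.0):
    """Return Chebyshev coefficients of the minimum weighted-norm interpolant."""
    k = np.arange(n_coeffs)
    V = np.cos(k[None, :] * np.arccos(x)[:, None])   # Chebyshev Vandermonde
    d = (1.0 + k) ** s                                # Sobolev-type weights
    W = V / d[None, :]                                # W = V D^{-1}
    # Substituting a = D^{-1} b turns the problem into the classical
    # minimum-norm solution of W b = f: b = W^T (W W^T)^{-1} f.
    y = np.linalg.solve(W @ W.T, f)
    return (W.T @ y) / d

# Usage: oversample the coefficient space relative to the data.
x = np.linspace(-1.0, 1.0, 11)
f = np.exp(x)
a = msn_interpolate(x, f, n_coeffs=40, s=2.0)
p = np.cos(np.arange(40)[None, :] * np.arccos(x)[:, None]) @ a
print(np.max(np.abs(p - f)))   # interpolation residual; should be tiny
\end{verbatim}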
One chapter is devoted to analyzing a stochastic norm estimate that is useful when computing low-rank approximations of matrices. This estimate allows us to compute approximations with relative error close to machine precision, which was not previously possible.
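As one concrete illustration of how such an estimate can be used, the sketch below estimates the residual norm of a low-rank approximation from a few random matrix-vector products. It assumes Gaussian test vectors, for which the expected squared sample norm equals the squared Frobenius norm; the function name is hypothetical, and the specific estimator analyzed in the dissertation may differ.

\begin{verbatim}
# Minimal sketch of a stochastic norm estimate for the residual of a
# low-rank approximation A ~= Q (Q^T A), using Gaussian test vectors:
# E ||E w||_2^2 = ||E||_F^2, so averaging gives an unbiased estimate.
import numpy as np

def estimate_residual_norm(A, Q, num_samples=10, rng=None):
    """Estimate ||A - Q Q^T A||_F using only matrix-vector products."""
    rng = np.random.default_rng() if rng is None else rng
    m, n = A.shape
    sq_norms = []
    for _ in range(num_samples):
        w = rng.standard_normal(n)
        y = A @ w
        y -= Q @ (Q.T @ y)          # apply the residual to the test vector
        sq_norms.append(np.dot(y, y))
    return np.sqrt(np.mean(sq_norms))

# Usage: relative error of a crude rank-20 range approximation.
rng = np.random.default_rng(0)
A = rng.standard_normal((200, 100)) @ rng.standard_normal((100, 100))
Q, _ = np.linalg.qr(A[:, :20])
rel_err = estimate_residual_norm(A, Q, rng=rng) / np.linalg.norm(A, 'fro')
print(rel_err)
\end{verbatim}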