We improve the running times of algorithms for least squares regression and low-rank approximation to account for the sparsity of the input matrix. Namely, if $\texttt{nnz}(A)$ denotes the number of non-zero entries of an input matrix $A$:

- we show how to solve approximate least squares regression given an $n \times d$ matrix $A$ in $\texttt{nnz}(A) + \mathrm{poly}(d \log n)$ time;
- we show how to find an approximate best rank-$k$ approximation of an $n \times n$ matrix in $\texttt{nnz}(A) + n \cdot \mathrm{poly}(k \log n)$ time.

All approximations are relative error. Previous algorithms based on fast Johnson-Lindenstrauss transforms took at least $nd \log d$ or $\texttt{nnz}(A) \cdot k$ time. We have implemented our algorithms, and preliminary results suggest they are competitive in practice.
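For concreteness, here is a minimal sketch-and-solve illustration of the regression result in Python. It assumes a CountSketch-style sparse embedding $S$ with a single random $\pm 1$ entry per column, one standard way to make forming $SA$ cost $O(\texttt{nnz}(A))$; the abstract does not fix the embedding or the sketch size $m$, so both are illustrative assumptions rather than the paper's exact construction.

```python
import numpy as np
import scipy.sparse as sp

def sketch_and_solve(A, b, m, seed=None):
    """Approximately minimize ||Ax - b||_2 via a sparse embedding (illustrative).

    S (m x n) has one random +/-1 entry per column, so forming S @ A
    touches each stored nonzero of A exactly once: O(nnz(A)) time.
    The small sketched m x d problem is then solved exactly.
    """
    rng = np.random.default_rng(seed)
    n, d = A.shape
    buckets = rng.integers(0, m, size=n)        # hash each row of A to a sketch row
    signs = rng.choice([-1.0, 1.0], size=n)     # independent Rademacher signs
    S = sp.csr_matrix((signs, (buckets, np.arange(n))), shape=(m, n))
    SA = S @ A                                  # m x d sketch of A
    Sb = S @ b                                  # sketched right-hand side
    if sp.issparse(SA):
        SA = SA.toarray()
    x, *_ = np.linalg.lstsq(SA, Sb, rcond=None)
    return x

# Hypothetical usage; m growing like poly(d)/eps^2 is a common choice
# for relative error, assumed here rather than taken from the paper.
A = sp.random(100_000, 20, density=0.01, format="csr", random_state=0)
b = np.random.default_rng(1).standard_normal(100_000)
x = sketch_and_solve(A, b, m=4_000, seed=2)
```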
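The low-rank result can be illustrated in the same sketch-first spirit: sketch the rows of $A$, then project $A$ onto the top-$k$ right singular directions of the small sketch. This simple one-sided variant is only a sketch of the idea: its projection step costs $O(\texttt{nnz}(A) \cdot k)$, so it does not attain the $\texttt{nnz}(A) + n \cdot \mathrm{poly}(k \log n)$ bound stated above, and the embedding and $m$ are again illustrative assumptions.

```python
import numpy as np
import scipy.sparse as sp

def sketched_rank_k(A, k, m, seed=None):
    """Illustrative sketch-based rank-k approximation (not the paper's exact algorithm).

    SA = S @ A costs O(nnz(A)); the top-k right singular vectors of the
    small m x n sketch give a subspace to project A onto. Returns factors
    (C, V_k) with A ~= C @ V_k.T. The projection A @ V_k costs
    O(nnz(A) * k), so this variant is slower than the bound in the abstract.
    """
    rng = np.random.default_rng(seed)
    n = A.shape[0]
    buckets = rng.integers(0, m, size=n)        # hash each row of A to a sketch row
    signs = rng.choice([-1.0, 1.0], size=n)     # independent Rademacher signs
    S = sp.csr_matrix((signs, (buckets, np.arange(n))), shape=(m, n))
    SA = S @ A
    if sp.issparse(SA):
        SA = SA.toarray()
    _, _, Vt = np.linalg.svd(SA, full_matrices=False)  # SVD of the small sketch
    V_k = Vt[:k].T                                     # top-k right singular vectors
    return A @ V_k, V_k                                # A ~= (A V_k) V_k^T
```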