We discuss computationally efficient Sparse Regression Codes (SPARCs) for communication and lossy compression. Codewords are sparse linear combinations of the columns of a design matrix. These codes were introduced by Barron \& Joseph, who proved that they achieve the capacity of the AWGN channel with computationally feasible decoding. Here we show how SPARCs can be used for efficient lossy compression. We present an encoder based on successive approximation and show that it attains the optimal distortion-rate function for i.i.d.\ Gaussian sources with squared-error distortion. We then demonstrate how source and channel coding SPARCs can be combined to implement random binning and superposition. Sparse Regression Codes therefore offer a way to design fast, rate-optimal codes for a variety of models in network information theory.
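To make the successive-approximation step concrete, the following is a minimal sketch in Python/NumPy, not the paper's implementation. The function name \texttt{sparc\_encode}, the fixed geometric coefficient schedule $c_i = \sqrt{\sigma^2 (2R/L)}\,(1-2R/L)^{(i-1)/2}$, and the parameter values are illustrative assumptions. The sketch shows only the structure: an $n \times LM$ design matrix split into $L$ sections of $M$ columns, with one column chosen per section, so that the residual power decays toward $\sigma^2 e^{-2R}$ (rate $R$ in nats per sample) as $M$ and $n$ grow.

\begin{verbatim}
import numpy as np

def sparc_encode(source, L, M, R, rng):
    """Successive-approximation SPARC encoder (illustrative sketch).

    The n x (L*M) design matrix is split into L sections of M columns.
    Section by section, the encoder picks the column most correlated
    with the current residual and subtracts it with a fixed,
    geometrically shrinking coefficient, so the residual power falls
    by a factor of roughly (1 - 2R/L) per section and ends near
    sigma^2 * exp(-2R), the Gaussian distortion-rate function
    (R in nats per sample).
    """
    n = len(source)
    # In practice the matrix is shared with the decoder via a common
    # seed; entries are i.i.d. N(0, 1).
    A = rng.standard_normal((n, L * M))
    sigma2 = np.mean(source**2)
    residual = source.astype(float).copy()
    chosen = np.zeros(L, dtype=int)   # the "message": one index per section
    for i in range(L):
        section = A[:, i * M:(i + 1) * M]
        chosen[i] = int(np.argmax(section.T @ residual))
        c_i = np.sqrt(sigma2 * 2 * R / L) * (1 - 2 * R / L) ** (i / 2)
        residual -= c_i * section[:, chosen[i]]
    return chosen, source - residual  # indices and reconstruction

rng = np.random.default_rng(0)
n, L, M = 1000, 40, 64
R = L * np.log(M) / n                 # rate in nats per sample
s = rng.standard_normal(n)            # unit-variance Gaussian source
idx, s_hat = sparc_encode(s, L, M, R, rng)
# The empirical distortion approaches exp(-2R) as M and n grow.
print(np.mean((s - s_hat)**2), np.exp(-2 * R))
\end{verbatim}

Note that the encoder's output is just the $L$ chosen column indices, i.e.\ $L \log_2 M$ bits describing $n$ samples; the decoder regenerates the design matrix from the shared seed and rebuilds the same linear combination.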