The ability to evaluate nonlinear function classes rapidly is crucial for nonparametric estimation. We propose an improvement to random kitchen sinks that offers $O(n \log d)$ computation and $O(n)$ storage for $n$ basis functions in $d$ dimensions, without sacrificing accuracy. We show how one may adjust the regularization properties of the kernel simply by changing the spectral distribution of the projection matrix. Experiments show that we achieve the same accuracy as full kernel expansions and random kitchen sinks while being 100x faster and using 1000x less memory.
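For context, the following is a minimal sketch of the baseline random kitchen sinks feature map (random Fourier features for a Gaussian kernel) that the proposed construction accelerates; the function name, parameters, and bandwidth choice are illustrative and not taken from the paper. The dense Gaussian projection $W$ below is precisely the $O(nd)$ compute and storage bottleneck that a structured projection matrix would replace.

```python
import numpy as np

def random_fourier_features(X, n_features, sigma=1.0, rng=None):
    """Random kitchen sinks style feature map approximating a Gaussian kernel.

    X          : (num_samples, d) data matrix
    n_features : n, the number of random basis functions
    sigma      : kernel bandwidth (illustrative parameter)
    """
    rng = np.random.default_rng(rng)
    d = X.shape[1]
    # Dense Gaussian projection: O(n d) storage and O(n d) compute per point.
    # A structured projection (e.g. built from diagonal and fast-transform
    # factors) is what reduces this to O(n log d) compute and O(n) storage.
    W = rng.normal(scale=1.0 / sigma, size=(d, n_features))
    b = rng.uniform(0.0, 2 * np.pi, size=n_features)
    # Cosine features with random phases; the 2/n scaling makes the inner
    # product of two feature vectors an unbiased estimate of the kernel.
    return np.sqrt(2.0 / n_features) * np.cos(X @ W + b)

# Usage: map a small batch of 8-dimensional points to 64 random features.
X = np.random.default_rng(0).normal(size=(5, 8))
Phi = random_fourier_features(X, n_features=64, sigma=2.0, rng=1)
print(Phi.shape)  # (5, 64)
```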