Sparse regression codes with approximate message passing (AMP) decoding have recently been shown to asymptotically achieve the capacity of the AWGN channel with computationally efficient decoding. Here, we refine this asymptotic result by deriving a large deviations bound on the probability of AMP decoding error. The bound shows that, for an appropriate choice of code parameters and any fixed rate smaller than the AWGN capacity, the probability of decoding error decays exponentially in n/(log n)^(2T). Here T, the number of AMP iterations required for successful decoding, is proportional to the logarithm of the ratio of the channel capacity to the rate. We also show how the large deviations result guides the choice of code parameters to optimize empirical performance.