The connection between the mutual information and the error probability for discrete constellations over the scalar additive white Gaussian noise channel is characterized in the limit as the signal-to-noise ratio (SNR) tends to infinity. More specifically, for an arbitrary input distribution, the ratio between the conditional entropy of the channel input given the channel output and the symbol error probability is shown to converge to $\pi$ as the SNR tends to infinity. A similar relationship between the generalized mutual information for bit-interleaved coded modulation (BICM-GMI) and the bit-error probability is also presented. Furthermore, the long-standing conjecture that Gray codes are the binary labelings that maximize the BICM-GMI at high SNR is proven.
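To make the first claim concrete, in the notation adopted here (ours, not necessarily the paper's), with $H(X|Y)$ the conditional entropy of the input given the output and $P_e$ the symbol error probability, the stated limit reads
$$\lim_{\mathrm{SNR}\to\infty}\frac{H(X|Y)}{P_e}=\pi,$$
where we take $H(X|Y)$ in nats (an assumption of this sketch; in bits the limiting constant would instead be $\pi/\ln 2$). A minimal numerical sanity check for the simplest constellation, equiprobable BPSK ($X=\pm 1$) over $Y=X+N$ with $N\sim\mathcal{N}(0,\sigma^2)$ and $\mathrm{SNR}=1/\sigma^2$, is sketched below; the channel parametrization and all function names are ours, and SciPy is assumed to be available.

```python
# Numerical sketch: the ratio H(X|Y)/SEP for BPSK approaches pi at high SNR.
import numpy as np
from scipy.integrate import quad
from scipy.stats import norm

def equivocation_nats(snr):
    """H(X|Y) in nats, by numerically integrating the binary entropy of the
    posterior P(X=+1 | Y=y) against the channel output density f_Y(y)."""
    sigma = 1.0 / np.sqrt(snr)

    def integrand(y):
        # Output density f_Y(y): an equal-weight two-Gaussian mixture.
        f = 0.5 * (norm.pdf(y, 1.0, sigma) + norm.pdf(y, -1.0, sigma))
        # For this channel the posterior LLR is l = 2y/sigma^2; the binary
        # entropy of the posterior, in nats, written in a numerically stable
        # form as h = log(1 + e^{-|l|}) + |l| e^{-|l|} / (1 + e^{-|l|}).
        l = abs(2.0 * y) / sigma**2
        h = np.log1p(np.exp(-l)) + l * np.exp(-l) / (1.0 + np.exp(-l))
        return f * h

    # The integrand is concentrated near the decision boundary y = 0;
    # epsabs=0 forces quad to meet a *relative* tolerance, since the
    # integral itself is exponentially small at high SNR.
    val, _ = quad(integrand, -4.0, 4.0, points=[0.0],
                  epsabs=0.0, epsrel=1e-10, limit=500)
    return val

for snr_db in (5, 10, 15, 20, 25):
    snr = 10.0 ** (snr_db / 10.0)
    sep = norm.sf(np.sqrt(snr))  # symbol error probability Q(sqrt(SNR))
    print(f"{snr_db:2d} dB: H(X|Y)/SEP = {equivocation_nats(snr)/sep:.4f}")
# The printed ratio climbs toward pi ~ 3.1416 as the SNR grows.
```

The integration is done in the output domain rather than via $I(X;Y)=H(X)-H(X|Y)$, because at high SNR the mutual information is exponentially close to $H(X)$ and the subtraction would suffer catastrophic cancellation.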