We investigate the asymptotic (in blocklength) tradeoffs between rate and minimum distance for codes that provide unequal error protection (UEP). Two notions of UEP are analyzed: bit-wise, where a subset of the bits is special and needs stronger protection, and message-wise, where a subset of the message set is special. Both notions are analyzed in two settings: binary alphabets and large alphabets. For message-wise UEP over a binary alphabet, it turns out that the special messages and ordinary messages can simultaneously achieve the Gilbert-Varshamov bound at their respective rates. A similar “successive refinement” of the Singleton bound is shown to hold for large non-binary alphabets. We also analyze the case in which there is only a single special message. For bit-wise UEP, it is shown that when the ordinary bits achieve the Singleton bound, even a single special bit cannot attain any larger distance. For the binary case, an upper bound is provided on the protection of the single special bit. These coding-theoretic limits in terms of Hamming distances are close analogues of the information-theoretic limits in terms of error exponents obtained by Borade et al.
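For reference, the two classical bounds invoked above can be stated in their standard asymptotic forms (these are textbook statements added here for context, not results of this work). Writing $R$ for the rate and $\delta = d/n$ for the relative minimum distance, the Gilbert-Varshamov bound guarantees, for any $0 < \delta < 1/2$, binary codes with
\[
R \;\ge\; 1 - H_2(\delta), \qquad H_2(x) = -x\log_2 x - (1-x)\log_2(1-x),
\]
while the Singleton bound states that every $(n,k,d)$ code satisfies $d \le n - k + 1$, i.e., asymptotically $R \le 1 - \delta$, with equality attainable by MDS codes over sufficiently large alphabets. The “successive refinement” results above concern achieving these tradeoffs simultaneously for the special and ordinary parts of the message.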