Interference is a central feature of wireless communication. In many scenarios, interference is {\em bursty}: interfering wireless links come and go. Designing a system under the assumption that interference is always present is overly conservative. In this paper, we take a fundamental information-theoretic standpoint and address the statistical gains associated with bursty interference in the context of a pair of unicast interfering wireless links. Modeling the problem as a ``degraded message set'' two-user Gaussian interference channel, we approximately characterize the symmetric capacity region. Our results demonstrate the existence of three fundamental regimes: one where treating interference as always present incurs no loss of optimality, another where one can perform as well as if interference were never present, and a third where the performance lies in between these two extremes.