Estimation of mutual information from observed samples is a classical problem in information theory that has recently seen renewed interest. While mutual information is defined for very general probability spaces, estimators have been developed only for the special cases of purely discrete or purely continuous pairs of random variables. In this paper, we develop an estimator of mutual information for discrete-continuous mixtures. We prove the consistency of this estimator and demonstrate its excellent practical performance on both simulated data and a problem in gene network inference.