New upper and lower bounds are given for the joint entropy of a collection of random variables, in both the discrete and continuous settings. These bounds generalize well-known information inequalities due to Han and yield new determinantal inequalities. In addition, several applications are suggested, including a new bound on the number of independent sets of a graph, which is of interest in discrete mathematics, and a bound on the number of zero-error codes.