The use of millimeter wave (mmWave) spectrum for future cellular systems is an intriguing prospect, but it relies on highly directional beamforming and dense base station deployments. Compared to conventional microwave cellular systems, several fundamental models and behaviors need to be revisited, including path loss and interference models and their effect on coverage/outage probability. Using actual building locations in major US metro areas in conjunction with new, empirically supported path loss models, we observe that mmWave cellular networks tend to be noise limited and that coverage depends on the user being able to connect to a line-of-sight (LOS), or LOS-like, base station. We also propose a general and tractable model that captures these new trends.
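To make the noise-limited and LOS-dependent coverage behavior concrete, the following is a minimal Monte Carlo sketch, not the paper's exact model: it estimates downlink coverage probability P[SINR > T] under a simple LOS-ball blockage model with distinct LOS/NLOS path loss exponents and a two-level (main-lobe/side-lobe) beamforming gain pattern. All parameter values (density, gains, LOS radius, noise figure) are illustrative assumptions, not values from the paper.

```python
import numpy as np

# Hypothetical Monte Carlo sketch of mmWave downlink coverage under a
# LOS-ball blockage model. All numerical parameters are assumptions.

rng = np.random.default_rng(0)

# --- assumed system parameters ---
lam_bs = 100e-6                        # BS density [BS/m^2] (dense deployment)
R_los = 200.0                          # LOS-ball radius [m]: links shorter than this are LOS
alpha_los, alpha_nlos = 2.0, 4.0       # path loss exponents (LOS / NLOS)
G_max, G_min = 10**(18/10), 10**(-2/10)  # main-lobe / side-lobe beamforming gains
p_align = 0.1                          # prob. an interferer's main lobe points at the user
noise_dbm, tx_dbm, T_db = -84.0, 30.0, 0.0
noise, tx, T = 10**(noise_dbm/10), 10**(tx_dbm/10), 10**(T_db/10)
area_radius = 1000.0                   # simulation disk radius [m]
n_trials = 5000

def path_gain(d):
    """Distance-dependent gain with LOS/NLOS exponents (LOS-ball model)."""
    alpha = np.where(d < R_los, alpha_los, alpha_nlos)
    return np.maximum(d, 1.0) ** (-alpha)

covered = 0
for _ in range(n_trials):
    # Poisson number of BSs, dropped uniformly in a disk around a user at the origin.
    n_bs = rng.poisson(lam_bs * np.pi * area_radius**2)
    if n_bs == 0:
        continue
    r = area_radius * np.sqrt(rng.random(n_bs))   # uniform-in-disk distances
    rx = tx * path_gain(r)                        # received power before antenna gains

    # Serving BS: strongest link, with full beamforming gain at both ends.
    k = int(np.argmax(rx))
    signal = G_max * rx[k]

    # Interferers: main-lobe hit with probability p_align, side lobe otherwise.
    mask = np.ones(n_bs, dtype=bool)
    mask[k] = False
    gains = np.where(rng.random(n_bs - 1) < p_align, G_max, G_min)
    interference = np.sum(gains * rx[mask])

    if signal / (interference + noise) > T:
        covered += 1

print(f"Estimated coverage probability: {covered / n_trials:.3f}")
```

With narrow beams and heavy blockage of distant interferers, the interference term in this sketch is typically small relative to thermal noise, and coverage is driven almost entirely by whether the serving link falls inside the LOS ball, which is the qualitative trend described above.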