Given two discrete random variables $X$ and $Y$, an operational approach is undertaken to quantify the ``leakage'' of information from $X$ to $Y$. The resulting measure $\mathcal{L}(X \to Y)$ is called \emph{maximal leakage}, and is defined as the multiplicative increase, upon observing $Y$, of the probability of correctly guessing a randomized function of $X$, maximized over all such functions. The definition is shown to be robust, and the resulting properties are consistent with an axiomatic view of a leakage measure. Maximal leakage is sandwiched between Shannon capacity and local differential privacy, which are given operational interpretations in this guessing framework, as is maximal correlation. Furthermore, the Shannon cipher system is studied using maximal leakage as a performance metric. The optimal limit is derived, and asymptotically-optimal encryption schemes are demonstrated.
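As a sketch of the operational definition described above (the notation $U$, $P_U$, and the estimator $\hat{u}(\cdot)$ is introduced here for illustration and is not fixed by this abstract), one may write the guessing ratio as
\[
\mathcal{L}(X \to Y) \;=\; \sup_{U \,:\, U - X - Y} \log
\frac{\displaystyle \max_{\hat{u}(\cdot)} \Pr\bigl(\hat{u}(Y) = U\bigr)}
     {\displaystyle \max_{u} P_U(u)},
\]
where the supremum ranges over randomized functions $U$ of $X$ (so that $U - X - Y$ forms a Markov chain), the numerator is the probability of correctly guessing $U$ from the observation $Y$, and the denominator is the probability of the best blind guess of $U$ without observing $Y$.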