Many quantities of interest in modern statistical analysis are non-smooth functionals of the underlying generative distribution, the observed data, or both. Examples include the test error of a learned classifier, parameters indexing an estimated optimal decision policy, and coefficients in a regression model after model selection has been performed. This lack of smoothness can lead to non-regular asymptotics in realistic scenarios and thus invalidate standard statistical procedures such as the bootstrap or series approximations. The focus of this talk is the development of tools for conducting valid statistical inference for non-smooth functionals by means of smooth, data-driven upper and lower bounds. More specifically, we sandwich the non-smooth functional between two smooth bounds and then approximate the distribution of the bounds using standard techniques. We then use estimated distributional features of the bounds, such as their quantiles, to conduct inference on the original non-smooth functional. We leverage the smoothness of the bounds to obtain consistent inference under both fixed and local alternatives. This consistency is important for high-quality performance in both large and small samples. Another important feature of these bounds is that they adapt to the underlying smoothness of the functional: they are asymptotically tight when the generative distribution happens to induce sufficient smoothness. A suite of empirical examples shows that confidence intervals formed using the techniques presented in this talk compare favorably with competing methods.
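
As a concrete illustration of the sandwich idea, the minimal sketch below (not drawn from the talk itself) applies it to the toy non-smooth functional theta(P) = max(E[X], 0), which is non-regular when E[X] is near zero. Here softplus functions with a user-chosen bandwidth eps play the role of the smooth upper and lower bounds, a percentile bootstrap approximates the distribution of each bound, and the outer quantiles of the two bounds form a conservative interval. All specific choices (the softplus bounds, eps, the percentile bootstrap) are illustrative assumptions rather than the construction developed in the talk.

```python
# Toy sketch: sandwich the non-smooth functional theta(P) = max(E[X], 0)
# between smooth softplus bounds, bootstrap the bounds, and take their
# outer quantiles as a conservative confidence interval.
import numpy as np

rng = np.random.default_rng(0)

def softplus(m, eps):
    # Smooth upper bound: eps * log(1 + exp(m / eps)) >= max(m, 0).
    return eps * np.logaddexp(0.0, m / eps)

def sandwich_ci(x, eps=0.1, alpha=0.05, n_boot=2000):
    """Percentile-bootstrap interval for max(E[X], 0) via smooth bounds."""
    n = len(x)
    boot_upper = np.empty(n_boot)
    boot_lower = np.empty(n_boot)
    for b in range(n_boot):
        m_star = rng.choice(x, size=n, replace=True).mean()
        boot_upper[b] = softplus(m_star, eps)                    # smooth upper bound
        boot_lower[b] = softplus(m_star, eps) - eps * np.log(2)  # smooth lower bound
    # Outer quantiles of the two smooth bounds give a conservative interval.
    lo = np.quantile(boot_lower, alpha / 2)
    hi = np.quantile(boot_upper, 1 - alpha / 2)
    return max(lo, 0.0), hi  # the functional is nonnegative by construction

x = rng.normal(loc=0.0, scale=1.0, size=200)  # mean sits near the kink at zero
print(sandwich_ci(x))
```

Because each bound is a smooth function of the sample mean, the bootstrap distribution of the bounds is well behaved even at the kink, which is the mechanism the talk exploits; as eps shrinks, the bounds tighten around the original functional.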