Nonparametric regressors on $\mathbb{R}^D$ are subject to a so-called 'curse of dimension': there always exists a distribution on which the regressor, given $n$ samples, converges no faster than $n^{-2/(2+D)}$. Equivalently, achieving an error of $\epsilon$ can require $n \approx \epsilon^{-(2+D)/2}$ samples, i.e., a sample size exponential in $D$. Fortunately, as we will see, there are common real-world scenarios where better regression rates are achievable for data in $\mathbb{R}^D$. We derive notions of 'local dimension' that better capture such scenarios; the local dimension is often much smaller than the ambient dimension $D$. We show that common 'local regressors' operating on $\mathbb{R}^D$ converge at rates that depend only on the local dimension, not on $D$, and these rates are often better than the worst-case $n^{-2/(2+D)}$.
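As one concrete illustration (not taken from the text above), $k$-NN regression is a standard example of a local regressor: its prediction at a query point depends only on the nearby samples, so it can adapt to low-dimensional structure in the data. Below is a minimal sketch under our own assumptions, with data supported on a one-dimensional circle embedded in $\mathbb{R}^{50}$; all names (`knn_regress`, `x_query`, etc.) are hypothetical.

```python
import numpy as np

def knn_regress(X_train, y_train, x_query, k=5):
    """Predict at x_query by averaging the targets of its k nearest neighbors."""
    dists = np.linalg.norm(X_train - x_query, axis=1)  # distances in ambient R^D
    nearest = np.argsort(dists)[:k]                    # indices of the k closest samples
    return y_train[nearest].mean()

rng = np.random.default_rng(0)
n, D = 2000, 50

# Data supported on a 1-dimensional circle embedded in R^D: the ambient
# dimension is 50, but the local dimension is 1.
theta = rng.uniform(0, 2 * np.pi, size=n)
X = np.zeros((n, D))
X[:, 0], X[:, 1] = np.cos(theta), np.sin(theta)
Q, _ = np.linalg.qr(rng.standard_normal((D, D)))  # random rotation so the
X = X @ Q.T                                        # structure is not axis-aligned
y = np.sin(3 * theta) + 0.1 * rng.standard_normal(n)  # noisy target

# Predict at a held-out point on the circle.
t0 = 1.0
x0 = np.zeros(D)
x0[0], x0[1] = np.cos(t0), np.sin(t0)
x0 = x0 @ Q.T
print(knn_regress(X, y, x0), "vs truth", np.sin(3 * t0))
```

Because the $k$ nearest neighbors of any query lie along the circle, the estimate behaves as if the data were one-dimensional; this is the intuition behind rates governed by the local rather than the ambient dimension.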