We study the learnability of sets in R^n under the Gaussian distribution, taking Gaussian Surface Area as the "complexity measure" of the sets being learned. Our approach yields the first subexponential-time algorithm for learning any convex set in R^n with respect to any Gaussian distribution, and it succeeds even in the agnostic model of learning. We also give an n^{O(log k)}-time algorithm for learning the intersection of k halfspaces (to any constant accuracy), improving on the previous best bound of n^{O(k)} due to Vempala. This talk will survey other known results on the complexity of learning intersections of halfspaces over various domains. On a technical level, we will describe new relationships among learning convex sets, real approximation theory, and concentration of measure.
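As a small illustration of the flavor of these results, the sketch below learns an intersection of halfspaces under a Gaussian distribution by fitting a low-degree polynomial and taking its sign. This is only a toy rendering of the polynomial-regression idea; the abstract's actual algorithm and its guarantees are not reproduced here, and all parameters (dimension n, number of halfspaces k, degree d, sample sizes) are illustrative choices, with ordinary least squares used as a simple stand-in for the regression step.

```python
import numpy as np
from itertools import combinations_with_replacement

rng = np.random.default_rng(0)
n, k, d = 3, 2, 2    # dimension, number of halfspaces, polynomial degree (illustrative)
m = 4000             # number of Gaussian training examples

# Target concept: intersection of k halfspaces {x : w_i . x <= b_i}.
W = rng.standard_normal((k, n))
b = rng.standard_normal(k) + 1.0

def label(X):
    # +1 inside the intersection, -1 outside
    return np.where(np.all(X @ W.T <= b, axis=1), 1.0, -1.0)

def features(X):
    # All monomials of degree <= d (for n=3, d=2: 1, x_i, x_i*x_j -> 10 features)
    cols = [np.ones(len(X))]
    for deg in range(1, d + 1):
        for idx in combinations_with_replacement(range(n), deg):
            cols.append(np.prod(X[:, idx], axis=1))
    return np.column_stack(cols)

# Draw Gaussian examples, fit a degree-d polynomial to the +/-1 labels,
# and use its sign as the hypothesis.
X = rng.standard_normal((m, n))
coef, *_ = np.linalg.lstsq(features(X), label(X), rcond=None)

# Estimate the error of sign(p(x)) on fresh Gaussian test points.
Xtest = rng.standard_normal((5000, n))
err = np.mean(np.sign(features(Xtest) @ coef) != label(Xtest))
print(f"degree-{d} polynomial regression, test error ~ {err:.3f}")
```

On this toy instance a degree-2 fit already predicts the concept well; the point of the quantitative results above is how the required degree (and hence the running time) scales with k and with the Gaussian surface area of the target set.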