Measurements from ice cores and tree rings, among many other natural covariates of surface temperature, have been used to produce a large number of global surface temperature reconstructions. While the geophysical problem of locating, measuring, and calibrating these so-called "proxies" is largely the domain of the earth and climate scientists, the process by which these data are transformed into a surface temperature reconstruction is almost entirely statistical. The reconstructions are built by fitting a model to the time epochs where the longer proxy record overlaps the shorter instrumental period (the last century and a half, at most). The model is then extrapolated backwards in time to produce a temperature reconstruction. This method must confront some imposing obstacles. First, the number of covariates is far larger than the number of target data points, so low-dimensional approximations to high-dimensional data are required. Second, the relationship between the proxies and the local temperature record is weak. Third, the spatial and temporal dependence structure is particularly complex, with features not easily captured by simple models. Climate scientists have adapted and invented a variety of statistical methods, based on linear models, that attempt to overcome these obstacles and produce valid reconstructions of the climate "field". In this talk, we review and evaluate these methods with an eye towards understanding their reliability. We also propose two new reconstructions: one based on an estimated posterior density of the historical average surface temperature, and a second that uses L1-regularized regression.
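
As a rough illustration of the calibrate-then-extrapolate approach described above, and of an L1-regularized fit in the many-proxies/few-observations setting, the sketch below trains a lasso regression of instrumental temperature on proxy series over the overlap period and then hindcasts the pre-instrumental years. This is not the authors' actual procedure or data; the proxy and temperature arrays are synthetic stand-ins chosen only to mimic the stated obstacles (many covariates, weak signal).

```python
# Illustrative sketch only: calibrate an L1-regularized (lasso) model on the
# instrumental overlap period, then extrapolate ("hindcast") backwards in time.
# All arrays are synthetic placeholders, not real proxy or instrumental data.
import numpy as np
from sklearn.linear_model import LassoCV

rng = np.random.default_rng(0)

n_years, n_proxies = 1000, 100      # many covariates relative to target points
n_instrumental = 150                # roughly a century and a half of observations

proxies = rng.normal(size=(n_years, n_proxies))                   # years x proxies
true_signal = proxies[:, :5] @ rng.normal(size=5)                 # few informative proxies
temperature = true_signal + rng.normal(scale=2.0, size=n_years)   # weak signal-to-noise

# Calibration: fit only on the recent epoch where proxies and instruments overlap.
overlap = slice(n_years - n_instrumental, n_years)
model = LassoCV(cv=5).fit(proxies[overlap], temperature[overlap])

# Reconstruction: apply the fitted model backwards over the proxy-only era.
reconstruction = model.predict(proxies[: n_years - n_instrumental])
print(f"nonzero proxy coefficients: {np.count_nonzero(model.coef_)}")
```

The L1 penalty drives most proxy coefficients to exactly zero, which is one way to obtain the low-dimensional approximation the abstract says is required when the covariates outnumber the instrumental observations.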