LEAST SQUARES ESTIMATION AND VARIABLE SELECTION UNDER MINIMAX CONCAVE PENALTY

Cun-Hui Zhang, Rutgers University

Statistical inference about a sparse high-dimensional signal vector is well understood if the signal is directly observed with white noise. If most components of the signal are zero, threshold estimators at the level $\lambda_{univ} = \sqrt{2\log p}$ find the exact set of nonzeros with high probability when the minimum absolute value of the nonzero components is slightly greater than $\lambda_{univ}$. If the signal vector belongs to a small $\ell_r$ ball of radius $R$, threshold estimators at a certain level $\lambda_{mm}$ approximately attain the minimax risk. We show that in linear regression models with $n$ data points and $p > n$ unknowns, these results can be extended with concave penalized LSE, provided that the number of significant components, or $R^r \lambda_{mm}^{-r}$, of the signal vector is no greater than a certain $d^*$, and the order of this $d^*$ could be as high as $n/\log(p/n)$. Implications of such results on sparse recovery will be discussed if time permits.
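The abstract names the minimax concave penalty but does not define it. For context only (this definition is not part of the abstract itself), the MCP introduced by Zhang (2010) penalizes each coefficient magnitude $t$ by

```latex
% MCP with regularization parameter \lambda > 0 and concavity parameter \gamma > 1
\rho(t; \lambda) = \lambda \int_0^{|t|} \Bigl(1 - \frac{x}{\gamma\lambda}\Bigr)_+ \, dx
  = \begin{cases}
      \lambda |t| - \dfrac{t^2}{2\gamma}, & |t| \le \gamma\lambda,\\[4pt]
      \dfrac{\gamma\lambda^2}{2},         & |t| > \gamma\lambda,
    \end{cases}
```

so the penalty grows like the $\ell_1$ penalty $\lambda|t|$ near zero but flattens to a constant beyond $\gamma\lambda$, which reduces the bias that convex penalties impose on large coefficients.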
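The direct-observation claim in the abstract — that hard thresholding at $\lambda_{univ} = \sqrt{2\log p}$ recovers the support when the nonzeros comfortably exceed that level — can be illustrated with a minimal simulation. This is a sketch under assumed unit noise variance, not taken from the talk; the variable names are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
p, k = 10_000, 10                      # ambient dimension, number of nonzeros
lam_univ = np.sqrt(2 * np.log(p))      # universal threshold level (sigma = 1)

# Sparse signal whose nonzero components clearly exceed the threshold.
theta = np.zeros(p)
support = rng.choice(p, size=k, replace=False)
theta[support] = 2.0 * lam_univ

# Direct observation of the signal in white noise.
y = theta + rng.standard_normal(p)

# Hard thresholding at lam_univ: keep only coordinates above the threshold.
estimate = np.where(np.abs(y) > lam_univ, y, 0.0)
recovered = np.flatnonzero(estimate)

print(f"true support size: {k}, recovered: {len(recovered)}")
```

With nonzeros at twice the universal level, the true support is recovered with probability close to one; with nonzeros only slightly above the level, occasional misses and false positives appear, matching the "slightly greater" qualification in the abstract.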