In this talk, we explore the application of Generalized Approximate Message Passing (GAMP) to binary linear classification and feature selection. We show that GAMP easily accommodates standard response models (logistic, probit, hinge loss) and outlier-corrupted extensions, as well as standard sparsity-promotion mechanisms for the predictor (ℓ1 regularization, sparse probabilistic priors). Moreover, the hyperparameters that govern these models can be tuned automatically using an expectation-maximization (EM) approach. In addition, a state-evolution framework characterizes the algorithm's trajectory under certain assumptions about the feature matrix. Finally, kernel methods can be incorporated if desired. Numerical results demonstrate the flexibility, efficiency, and accuracy of this approach.
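To make the setup concrete, below is a minimal sketch of one instantiation mentioned above: sum-product GAMP with a probit response model and a Bernoulli-Gaussian (spike-and-slab) prior on the weight vector. All names, problem sizes, and hyperparameter values (m, n, lam, sig2, v) are illustrative assumptions, not the talk's experimental settings, and the EM hyperparameter tuning, state evolution, and kernelization are omitted here for brevity.

```
# Minimal sum-product GAMP sketch: probit channel + Bernoulli-Gaussian prior.
# Hyperparameters are fixed (no EM tuning); sizes are illustrative only.
import numpy as np
from scipy.stats import norm
from scipy.special import log_ndtr, expit

rng = np.random.default_rng(0)

# Synthetic problem: sparse weights x0, labels y = sign(A @ x0 + probit noise)
m, n = 300, 500          # training samples, features
lam, sig2, v = 0.1, 1.0, 0.1  # sparsity rate, slab variance, probit noise std
A = rng.standard_normal((m, n)) / np.sqrt(n)   # i.i.d. feature matrix
x0 = (rng.random(n) < lam) * rng.normal(0.0, np.sqrt(sig2), n)
y = np.sign(A @ x0 + v * rng.standard_normal(m))  # labels in {-1, +1}

A2 = A ** 2
xhat, taux, shat = np.zeros(n), np.full(n, lam * sig2), np.zeros(m)

for it in range(50):
    # Output linear step: pseudo-prior N(z; phat, taup) on z = A @ x
    taup = np.maximum(A2 @ taux, 1e-12)
    phat = A @ xhat - taup * shat
    # Output nonlinear step: probit likelihood p(y|z) = Phi(y * z / v)
    c = y * phat / np.sqrt(taup + v ** 2)
    ratio = np.exp(norm.logpdf(c) - log_ndtr(c))     # phi(c)/Phi(c), stable
    zhat = phat + y * taup / np.sqrt(taup + v ** 2) * ratio
    tauz = taup - taup ** 2 / (taup + v ** 2) * ratio * (c + ratio)
    shat = (zhat - phat) / taup
    taus = (1.0 - tauz / taup) / taup
    # Input linear step: pseudo-measurement r ~ N(x, taur)
    taur = 1.0 / np.maximum(A2.T @ taus, 1e-12)
    rhat = xhat + taur * (A.T @ shat)
    # Input nonlinear step: Bernoulli-Gaussian (spike-and-slab) denoiser
    gam = rhat * sig2 / (sig2 + taur)                # slab posterior mean
    nu = sig2 * taur / (sig2 + taur)                 # slab posterior variance
    log_odds = (np.log(lam / (1.0 - lam))
                + 0.5 * np.log(taur / (sig2 + taur))
                + 0.5 * rhat ** 2 * sig2 / (taur * (sig2 + taur)))
    pi = expit(log_odds)                             # posterior support prob.
    xhat = pi * gam
    taux = np.maximum(pi * (nu + gam ** 2) - xhat ** 2, 1e-12)

print(f"training error: {np.mean(np.sign(A @ xhat) != y):.3f}")
```

Swapping the response model (e.g., to logistic or hinge) only changes the output nonlinear step, and swapping the sparsity mechanism (e.g., to an ℓ1/Laplacian prior) only changes the input denoiser; this modularity is what makes the framework easy to extend.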