Partial least squares (PLS) regression combines dimensionality reduction and prediction. As the predictor dimension increases, variable selection becomes essential to avoid over-fitting, to provide more accurate predictions, and to yield more interpretable parameters. We propose a global variable selection approach that penalizes the total number of variables used across all PLS components by formulating PLS with a global sparsity constraint, a mixed L1/L2-norm penalty on the weight matrix, as a variational optimization problem. A novel augmented Lagrangian method is proposed to solve this problem, in which soft thresholding for sparsity arises naturally as part of the iteration. Experimental results show that the modified PLS attains better predictive performance with far fewer selected predictors.
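To illustrate how a mixed L1/L2-norm penalty on the weight matrix yields global variable selection, the sketch below shows the row-wise (block) soft-thresholding operator, which is the proximal map of a penalty of the form tau * sum_j ||W[j,:]||_2. This is a minimal illustration under the assumption that each row of W collects one predictor's weights across all PLS components; it is not the paper's full augmented Lagrangian algorithm. The function name and threshold parameter are hypothetical.

```python
import numpy as np

def block_soft_threshold(W, tau):
    """Proximal operator of tau * sum_j ||W[j, :]||_2 (illustrative sketch).

    Each row of W holds one predictor's weights across all PLS components.
    Rows whose Euclidean norm falls below tau are zeroed entirely, so the
    corresponding predictor is removed from every component at once; the
    remaining rows are shrunk toward zero.
    """
    row_norms = np.linalg.norm(W, axis=1, keepdims=True)
    scale = np.maximum(1.0 - tau / np.maximum(row_norms, 1e-12), 0.0)
    return W * scale

# Example: a 5-predictor, 2-component weight matrix; weak rows are dropped.
W = np.array([[0.9, -0.4],
              [0.05, 0.02],
              [-0.7, 0.6],
              [0.01, -0.03],
              [0.3, 0.2]])
print(block_soft_threshold(W, tau=0.1))
```

In an augmented Lagrangian iteration, an update of this row-wise shrinkage form is what produces the soft thresholding referred to in the abstract, zeroing whole predictors rather than individual entries.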