We devise a novel subsampling-based model selection method for regularized linear regression, e.g., ridge regression or the Lasso, that can be carried out in a distributed fashion. Such a development is computationally beneficial, particularly for modern applications in which data sizes grow drastically and model selection must be performed efficiently. Our method is inspired by a recently developed parallelized bootstrap method, the Bag of Little Bootstraps (BLB), which subsamples the data multiple times. Experiments on both synthetic and real fMRI data sets show that our model selection criterion works well for both ridge regression and the Lasso, while the running time is an order of magnitude smaller than that of cross-validation (CV) or the recently developed estimation-stability cross-validation (ES-CV).
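The general flavor of subsample-based model selection can be sketched as follows. This is a minimal, hypothetical illustration, not the paper's actual algorithm: it draws BLB-style little subsamples of size b = n^gamma, fits ridge estimates over a grid of regularization strengths on each subsample, and scores them on the remaining data; the held-out mean-squared-error criterion used here is a stand-in for the selection criterion developed in the paper.

```python
import numpy as np

def blb_style_ridge_selection(X, y, lambdas, n_subsamples=5, gamma=0.6, seed=0):
    """Illustrative BLB-style model selection for ridge regression.

    For each subsample of size b = n**gamma, fit closed-form ridge
    estimates over a grid of regularization strengths and score them on
    the remaining data.  Scores are averaged across subsamples and the
    lambda with the smallest average score is returned.  (The scoring
    criterion here is a hypothetical stand-in, not the paper's.)
    """
    rng = np.random.default_rng(seed)
    n, p = X.shape
    b = int(n ** gamma)                      # "little" subsample size
    scores = np.zeros(len(lambdas))
    for _ in range(n_subsamples):
        idx = rng.choice(n, size=b, replace=False)
        mask = np.ones(n, dtype=bool)
        mask[idx] = False                    # held-out remainder
        Xs, ys = X[idx], y[idx]
        for j, lam in enumerate(lambdas):
            # Closed-form ridge estimate fit on the subsample only.
            beta = np.linalg.solve(Xs.T @ Xs + lam * np.eye(p), Xs.T @ ys)
            resid = y[mask] - X[mask] @ beta
            scores[j] += np.mean(resid ** 2)
    return lambdas[int(np.argmin(scores))]

# Usage on synthetic data: y depends only on the first 3 of 10 features.
rng = np.random.default_rng(1)
X = rng.standard_normal((500, 10))
beta_true = np.zeros(10)
beta_true[:3] = [2.0, -1.5, 1.0]
y = X @ beta_true + 0.5 * rng.standard_normal(500)

grid = np.logspace(-2, 3, 12)
best_lam = blb_style_ridge_selection(X, y, grid)
print(best_lam)
```

Because each subsample is fit independently, the outer loop parallelizes trivially across workers, which is the property that makes this style of method attractive in a distributed setting.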