This talk is on the stochastic subgradient mirror-descent method for solving constrained convex minimization problems. In particular, a stochastic subgradient mirror-descent method with weighted iterate-averaging is discussed and its per-iteration convergence rate is analyzed. The novel part of the approach is the choice of the weights used to construct the averages. Through the use of these weighted averages, we show that the known optimal rates can be attained with simpler algorithms than those currently in the literature. The stepsize choices that achieve the best rates are those proposed by Paul Tseng for the acceleration of proximal gradient methods.
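The abstract does not spell out the update rule, but a minimal sketch may help fix ideas. The sketch below assumes the Euclidean mirror map, under which the mirror-descent step reduces to a projected stochastic subgradient step, and uses weights proportional to the iteration index, one simple weighting from this line of work. The function names (`stochastic_subgradient_weighted_avg`, `proj_ball`), the example objective, and the $1/\sqrt{k}$ stepsizes are illustrative assumptions, not the speaker's exact scheme or Tseng's stepsize rule.

```python
import numpy as np

def stochastic_subgradient_weighted_avg(stoch_subgrad, project, x0,
                                        stepsizes, weights, num_iters, rng):
    """Projected stochastic subgradient method (mirror descent with the
    Euclidean mirror map) returning a weighted average of the iterates.

    stoch_subgrad(x, rng) -> noisy subgradient of the objective at x
    project(x)            -> Euclidean projection onto the constraint set
    stepsizes[k], weights[k] -> per-iteration stepsize and averaging weight
    """
    x = np.asarray(x0, dtype=float)
    avg = np.zeros_like(x)   # running weighted average of iterates
    wsum = 0.0               # running sum of weights
    for k in range(num_iters):
        g = stoch_subgrad(x, rng)
        x = project(x - stepsizes[k] * g)        # subgradient step + projection
        wsum += weights[k]
        avg += (weights[k] / wsum) * (x - avg)   # incremental weighted average
    return avg

# Hypothetical usage: minimize f(x) = E_z[ ||x - z||_1 ] over the unit ball,
# with weights w_k proportional to k + 1 (an illustrative choice).
rng = np.random.default_rng(0)
d, T = 5, 2000
z_mean = np.full(d, 0.3)

def sg(x, rng):
    z = z_mean + 0.1 * rng.standard_normal(d)    # noisy sample of z
    return np.sign(x - z)                        # subgradient of ||x - z||_1

def proj_ball(x):
    n = np.linalg.norm(x)
    return x if n <= 1.0 else x / n

steps = [1.0 / np.sqrt(k + 1) for k in range(T)]
wts = [k + 1.0 for k in range(T)]
x_avg = stochastic_subgradient_weighted_avg(sg, proj_ball, np.zeros(d),
                                            steps, wts, T, rng)
```

The incremental update `avg += (w_k / W_k) * (x_k - avg)` reproduces the weighted average $\bar{x}_k = \sum_{i \le k} w_i x_i / \sum_{i \le k} w_i$ without storing the iterate history, which is what makes weighted averaging essentially free to add to a plain subgradient scheme.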