Large-scale iterative learning problems pose new challenges to coded computing, requiring joint optimization of the coding gain and the convergence rate. In this work, we develop coded computing techniques for iterative learning problems using a new decoding algorithm inspired by weighted least squares. This decoding algorithm avoids an often-ignored numerical issue of "error amplification" in decoding real-number codes, and yields a faster convergence rate than replication-based and uncoded methods. We test our algorithm on the distributed computation of personalized PageRank, implemented on real systems, and observe a four-order-of-magnitude reduction in mean squared error (MSE) under a computation deadline.
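To make the weighted least squares idea concrete, the following is a minimal sketch (not the paper's actual decoder): partial worker results are modeled as noisy coded observations whose error level shrinks with the number of iterations each worker completed before the deadline, and the decoder weights each observation by its inverse error variance rather than inverting an arbitrary subset of the encoding matrix. The encoding matrix `G`, the geometric error model `sigma`, and all dimensions here are illustrative assumptions.

```python
import numpy as np

np.random.seed(0)
k, n = 4, 8                       # k data blocks, n workers (hypothetical sizes)
G = np.random.randn(n, k)         # assumed real-valued encoding matrix
x = np.random.randn(k)            # true result of the iterative computation

# Each worker returns a partial (unconverged) result; we assume its residual
# error decays geometrically in the number of iterations it finished.
iters = np.random.randint(1, 10, size=n)
sigma = 0.5 ** iters                        # assumed per-worker error level
y = G @ x + sigma * np.random.randn(n)      # noisy coded observations

# Weighted least squares decoding: weight each worker by its inverse error
# variance, so nearly-converged workers dominate and slow workers' noise
# is not amplified through a matrix inverse.
W = np.diag(1.0 / sigma**2)
x_wls = np.linalg.solve(G.T @ W @ G, G.T @ W @ y)

# Naive decoding for contrast: invert one k-subset of workers, ignoring
# how converged each one is.
x_naive = np.linalg.solve(G[:k], y[:k])

print("WLS error:  ", np.linalg.norm(x_wls - x))
print("naive error:", np.linalg.norm(x_naive - x))
```

The key design point is that the weights let every returned result contribute in proportion to its reliability, instead of the decoder treating all responses as equally accurate.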