In distributed optimization, the complete optimization problem is not available at a single location but is distributed among different agents. The distributed optimization problem is additionally stochastic when the information available to each agent is corrupted by stochastic errors. Stochastic errors arise naturally when the objective function known to an agent has a random component with unknown statistics; they also model communication and quantization errors. Communication constraints, the lack of global information about the network topology, and the absence of coordinating agents make it infeasible to collect all the information at a single location. Thus, the optimization problem has to be solved using algorithms that are distributed, i.e., different parts of the algorithm are executed at different agents, and local, i.e., each agent uses only information locally available to it and other information it can obtain from its immediate neighbors. In this paper, we focus primarily on the specific problem of minimizing a sum of functions over a constraint set, when each component function is known partially (with stochastic errors) to a unique agent. The constraint set is known to all the agents. We propose three distributed and local algorithms, establish their asymptotic convergence with diminishing step-sizes, and obtain rate-of-convergence results. We illustrate the application of these algorithms to the problem of estimating the location and intensity of a heat source using measurements from a distributed set of sensors.
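To make the problem setting concrete, the following is a minimal sketch of one standard approach in this class: a consensus-based distributed projected stochastic gradient method. All specifics here (the quadratic component functions, the uniform mixing matrix, the box constraint set, and the 1/k step-size) are illustrative assumptions, not the algorithms proposed in the paper; each agent mixes its estimate with its neighbors', takes a step along a noisy gradient of its own component function, and projects onto the common constraint set.

```python
import numpy as np

# Illustrative setup (not the paper's algorithms): agent i privately holds
# f_i(x) = ||x - c_i||^2 and observes only noisy gradients of it.
# The goal is to minimize sum_i f_i(x) over the common constraint set
# X = [-1, 1]^d, whose unconstrained minimizer is the mean of the c_i.
rng = np.random.default_rng(0)
n, d = 4, 2                                      # number of agents, dimension
centers = rng.uniform(-0.5, 0.5, size=(n, d))    # private data c_i of each agent
W = np.full((n, n), 1.0 / n)                     # doubly stochastic mixing matrix
x = rng.uniform(-1.0, 1.0, size=(n, d))          # each agent's local iterate

for k in range(1, 5001):
    mixed = W @ x                                # consensus step: weighted average
                                                 # of neighbors' estimates
    noisy_grad = 2.0 * (mixed - centers) \
        + 0.1 * rng.standard_normal((n, d))      # stochastic gradient of f_i
    step = 1.0 / k                               # diminishing step-size
    x = np.clip(mixed - step * noisy_grad, -1.0, 1.0)  # projection onto X

consensus = x.mean(axis=0)                       # agents' common estimate
```

With the diminishing step-size, the stochastic gradient noise is averaged out and all agents' iterates approach the minimizer of the sum (here, the mean of the `centers`), which is the qualitative behavior the convergence analysis establishes.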