Manifold learning algorithms map the data to a (weighted) graph that encodes their neighborhood relations. This requires knowledge of the "neighborhood scale" epsilon, or alternatively of the average number of nearest neighbors at that scale. Most of what is known about setting the scale parameter epsilon is asymptotic, in the form of rates that depend on unknown manifold parameters. We consider the problem of setting epsilon from the actual data, and offer a solution that is both practical and principled. We do so by exploiting the connection between the manifold geometry, represented by the Riemannian metric, and the Laplace-Beltrami operator, in order to optimize how well the graph Laplacian preserves the geometry of the data. Experiments show that our method is effective and robust.
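To fix ideas, the central object is the graph Laplacian built at a given scale epsilon. The sketch below constructs a random-walk graph Laplacian from a Gaussian kernel with bandwidth epsilon on toy data sampled near a circle; it illustrates the role of epsilon only, and does not reproduce the paper's Riemannian-metric-based criterion for choosing it. All function and variable names are our own.

```python
import numpy as np

def graph_laplacian(X, epsilon):
    """Random-walk graph Laplacian L = I - D^{-1} K at scale epsilon.

    Illustrative sketch: K is a Gaussian kernel with bandwidth epsilon;
    the paper's method for *selecting* epsilon is not implemented here.
    """
    # pairwise squared Euclidean distances
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(axis=-1)
    K = np.exp(-d2 / epsilon)        # affinity matrix at scale epsilon
    np.fill_diagonal(K, 0.0)         # no self-edges
    D = K.sum(axis=1)                # node degrees
    P = K / D[:, None]               # row-stochastic random-walk matrix
    return np.eye(len(X)) - P        # L = I - P

# toy data: noisy samples from a circle, a 1-D manifold embedded in R^2
rng = np.random.default_rng(0)
t = rng.uniform(0.0, 2.0 * np.pi, 200)
X = np.c_[np.cos(t), np.sin(t)] + 0.01 * rng.normal(size=(200, 2))

L = graph_laplacian(X, epsilon=0.1)
print(L.shape)
```

Scanning a range of candidate epsilon values and comparing the resulting Laplacians is the setting in which a data-driven selection criterion, such as the one proposed here, operates.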