The truncated singular value decomposition (SVD) of the measurement matrix is the optimal solution to the _representation_ problem of how to best approximate a noisy measurement matrix using a low-rank matrix. Here, we consider the (unobservable) _denoising_ problem of how to best approximate a low-rank signal matrix buried in noise by optimal (re)weighting of the singular vectors of the measurement matrix. We exploit recent results from random matrix theory to exactly characterize the large-matrix limit of the optimal weighting coefficients and show that they can be computed directly from data for a large class of noise models that includes the i.i.d. Gaussian noise case. The analysis reveals why singular value thresholding with convex penalty functions (e.g., the nuclear norm) is suboptimal.
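To make the distinction concrete, the following is a minimal NumPy sketch (not the paper's algorithm) contrasting rank-r truncated SVD, which keeps the empirical singular values as-is, with a reconstruction that reuses the empirical singular _vectors_ but substitutes externally supplied weights for the singular values. The function names, the choice of weights, and the toy data are illustrative assumptions; the paper's contribution is characterizing and estimating the optimal weights, which this sketch does not do.

```python
import numpy as np

def truncated_svd_denoise(Y, r):
    """Rank-r truncated SVD: keep the top-r singular triplets of Y unchanged."""
    U, s, Vt = np.linalg.svd(Y, full_matrices=False)
    return (U[:, :r] * s[:r]) @ Vt[:r, :]

def weighted_svd_denoise(Y, weights):
    """Reconstruct from the singular vectors of Y, replacing the empirical
    singular values with the supplied weights (one per retained component)."""
    U, s, Vt = np.linalg.svd(Y, full_matrices=False)
    w = np.asarray(weights, dtype=float)
    r = w.size
    return (U[:, :r] * w) @ Vt[:r, :]

# Illustrative use: a rank-1 signal buried in i.i.d. Gaussian noise.
# The weight 2.5 below is a hypothetical shrunken value, not an optimal one.
rng = np.random.default_rng(0)
n = 200
X = 3.0 * np.outer(rng.standard_normal(n), rng.standard_normal(n)) / n
Y = X + rng.standard_normal((n, n)) / np.sqrt(n)
Xhat_truncated = truncated_svd_denoise(Y, r=1)
Xhat_weighted = weighted_svd_denoise(Y, weights=[2.5])
```

The point of the comparison is that both estimators use the same singular vectors of the noisy matrix; they differ only in the coefficients applied to each rank-one component, which is exactly the degree of freedom the optimal weighting exploits.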