Sketching techniques have become popular for scaling up machine learning algorithms by reducing the sample size or dimensionality of massive data sets, while preserving their statistical power. We study sketching from an optimization point of view. We first show that the iterative Hessian sketch is an optimization process with preconditioning, and develop an accelerated iterative Hessian sketch by searching along conjugate directions. We then establish primal-dual connections between the Hessian sketch and dual random projection, and apply the preconditioned conjugate gradient approach to the dual problem, which leads to accelerated iterative dual random projection methods. Finally, we propose the primal-dual sketch, which iteratively sketches the primal and dual formulations, and show that with a logarithmic number of calls to a solver for the small, sketched problem, the primal-dual sketch recovers the optimum of the original problem to arbitrary precision.
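To make the preconditioning view concrete, the following is a minimal sketch of the iterative Hessian sketch specialized to unconstrained least squares, min_x ||Ax - b||^2 / 2: each iteration keeps the exact gradient A^T(Ax - b) but replaces the Hessian A^T A with the sketched Hessian (SA)^T(SA) for a fresh random sketch S, i.e., a preconditioned gradient step. The Gaussian sketch, dimensions, and iteration counts below are illustrative assumptions, not the paper's exact configuration.

```python
import numpy as np

def iterative_hessian_sketch(A, b, sketch_dim, n_iter=10, seed=0):
    """Iterative Hessian sketch for min_x ||Ax - b||^2 / 2.

    Each step solves the small d x d system
        (S_t A)^T (S_t A) u = A^T (A x_t - b),
    where S_t is a fresh sketch, and updates x_{t+1} = x_t - u.
    The exact gradient plus a sketched Hessian makes this a
    preconditioned first-order method.
    """
    rng = np.random.default_rng(seed)
    n, d = A.shape
    x = np.zeros(d)
    for _ in range(n_iter):
        # Fresh Gaussian sketch S_t (sketch_dim x n), scaled so that
        # E[(S A)^T (S A)] = A^T A.
        S = rng.standard_normal((sketch_dim, n)) / np.sqrt(sketch_dim)
        SA = S @ A
        grad = A.T @ (A @ x - b)          # exact full gradient
        u = np.linalg.solve(SA.T @ SA, grad)  # small sketched system
        x = x - u
    return x

# Illustrative usage on a synthetic overdetermined system.
rng = np.random.default_rng(1)
A = rng.standard_normal((2000, 50))
b = A @ rng.standard_normal(50) + 0.01 * rng.standard_normal(2000)
x_hat = iterative_hessian_sketch(A, b, sketch_dim=400, n_iter=15)
x_ls = np.linalg.lstsq(A, b, rcond=None)[0]
print(np.linalg.norm(x_hat - x_ls))  # should be near zero
```

Because each iteration contracts the error by a factor depending on the ratio of the sketch dimension to d, a modest number of iterations suffices to approach the exact least-squares solution, which is the same geometric-convergence behavior the accelerated variants above improve upon.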