In information theory, lossless source coding concerns the compression of discrete sources at an encoding rate no less than the entropy of the source. Compressed sensing, under a probabilistic prior on the source, deals with the analog compression of a source by taking a small number of linear measurements of it. We introduce iso-entropic matrices and apply them to the almost lossless analog compression of memoryless sources in an information-theoretic framework. In particular, we study the measurement rate required to preserve the information isometry. The fundamental limit is shown to be the Rényi information dimension, introduced by Rényi in 1959, and it can be achieved by an ensemble of deterministically truncated Hadamard matrices.
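For reference, the Rényi information dimension invoked above is standardly defined through the entropy of a uniformly quantized version of the source; the following is the textbook definition (stated when the limit exists), not a formula taken from this abstract:
\[
  d(X) \;=\; \lim_{m \to \infty} \frac{H\!\left(\langle X \rangle_m\right)}{\log m},
  \qquad
  \langle X \rangle_m \;=\; \frac{\lfloor m X \rfloor}{m},
\]
where $\langle X \rangle_m$ denotes $X$ quantized to a grid of mesh $1/m$; when the limit does not exist, the $\liminf$ and $\limsup$ define the lower and upper information dimensions. In particular, $d(X) = 0$ for discrete $X$ and $d(X) = 1$ for $X$ with an absolutely continuous distribution, which is consistent with its role here as the minimal measurement rate per source symbol.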