Lossless compression of a discrete information source down to its entropy rate is a well-studied topic. A lesser-known approach to this problem is based on symbolic dynamical systems, where the information-generating mechanism is modeled by a randomly initialized iterative mapping of the unit interval to itself, and the emitted source sequence is a quantized observation of that process. For well-behaved mappings, this dynamical model leads to a natural lossless compression scheme attaining the entropy rate of the source. In this talk, we show how dynamical models can be further utilized for lossy compression, assuming a feedforward link is available. To that end, we model the source via a two-dimensional symbolic dynamical system, where one component corresponds to the lossy reconstruction and the other essentially corresponds to the feedforward signal. For a memoryless source and an arbitrary bounded distortion measure, we describe a specific quasilinear dynamical model based on our recently proposed `posterior matching' principle. Combining elements from the theory of continuous state-space Markov chains with contraction properties of iterated function systems, we show that this construction yields a family of simple deterministic compression schemes attaining the rate-distortion function of the source. We conclude the talk with some simple extensions and a brief discussion of future challenges.
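As a concrete illustration of the lossless dynamical model, consider the standard textbook example (illustrative only, with generic notation $T$, $x_n$, $s_n$; the construction in the talk is more general): the doubling map on the unit interval with the binary partition,
\[
  T(x) = 2x \bmod 1, \qquad x_{n+1} = T(x_n), \qquad s_n = \mathbf{1}\{x_n \ge 1/2\}.
\]
If $x_0$ is drawn uniformly on $[0,1)$, the emitted symbols $s_0, s_1, \ldots$ are i.i.d. Bernoulli($1/2$), and the first $n$ symbols determine $x_0$ to within an interval of length $2^{-n}$, since
\[
  x_0 \in \Bigl[\,\textstyle\sum_{k=0}^{n-1} s_k 2^{-(k+1)},\;\; \sum_{k=0}^{n-1} s_k 2^{-(k+1)} + 2^{-n}\Bigr).
\]
Describing $x_0$ to $n$-bit precision therefore losslessly recovers $(s_0, \ldots, s_{n-1})$ at one bit per symbol, which in this example equals the entropy rate of the source.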
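For reference, the benchmark in the lossy part of the talk is the standard rate-distortion function: for a memoryless source $X$ and a bounded single-letter distortion measure $d$, it is given by
\[
  R(D) = \min_{p(\hat{x} \mid x)\,:\, \mathbb{E}\bigl[d(X, \hat{X})\bigr] \le D} I(X; \hat{X}),
\]
the least rate at which the source can be described while keeping the expected distortion below $D$.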