Differential privacy is a popular notion of privacy aimed at quantifying and controlling the risk that an individual incurs by participating in a statistical database. We discuss scenarios where one would like to provide such privacy guarantees for dynamical systems, e.g., monitoring and control systems such as smart grids or intelligent transportation systems. We then discuss how system-theoretic and signal processing tools allow us to design differentially private filters for various types of dynamic data. We remark that if statistical models of the users' signals are available, exploiting them can lead to significant performance improvements for differentially private mechanisms, without sacrificing the privacy guarantees.
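As a minimal illustration of the kind of mechanism discussed here (not the specific filter designs from the paper), the sketch below applies the standard Laplace mechanism pointwise to a signal: each sample is perturbed with Laplace noise whose scale is the signal's sensitivity divided by the privacy parameter epsilon. The `sensitivity` and `epsilon` parameters are the generic differential-privacy quantities; their values here are illustrative assumptions.

```python
import numpy as np

def laplace_mechanism(signal, sensitivity, epsilon, seed=None):
    """Perturb each sample of `signal` with Laplace noise of scale
    sensitivity / epsilon, the standard mechanism for epsilon-DP.

    Illustrative sketch: assumes `sensitivity` bounds how much the
    signal can change when one individual's data changes.
    """
    rng = np.random.default_rng(seed)
    scale = sensitivity / epsilon
    return signal + rng.laplace(0.0, scale, size=np.shape(signal))

# Example: privatize a length-100 signal with sensitivity 1 and epsilon 0.5.
x = np.zeros(100)
y = laplace_mechanism(x, sensitivity=1.0, epsilon=0.5, seed=0)
```

Smaller values of `epsilon` give stronger privacy but inject more noise; the paper's point is that filtering and signal models can recover utility lost to this noise without weakening the guarantee.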