We investigate problems in the scanning of multidimensional data arrays, such as universal scanning and prediction (``scandiction,'' for short) of noiseless and noisy data arrays. These problems arise in several aspects of image and video processing. In predictive coding, for example, an image is compressed by coding the prediction-error sequence resulting from scandicting it. It is therefore natural to ask what the optimal method to scan and predict a given image is, whether universal scandiction schemes exist, and what the resulting scandiction loss is. First, given a random field, we examine whether universal scandiction schemes exist. This question is answered in the affirmative for the set of all spatially stationary random fields, under mild conditions on the loss function. We then discuss the scenario in which a non-optimal scanning order is used, yet accompanied by an optimal predictor, and derive a bound on the excess loss compared with optimal scandiction. For individual data arrays, we show that universal scandictors with respect to arbitrary finite scandictor sets do not exist; nevertheless, the Peano-Hilbert scan is shown to have uniformly small redundancy relative to optimal finite-state scandiction. Finally, we examine the scenario in which the random field is corrupted by noise, but the scanning and prediction (or filtering) scheme is judged with respect to the underlying noiseless field. Special emphasis is given to the interesting scenarios of binary random fields transmitted through binary symmetric channels and Gaussian random fields corrupted by additive white Gaussian noise.
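For concreteness, the Peano-Hilbert scan order referred to above can be generated with the standard index-to-coordinate mapping for the Hilbert curve. The sketch below is illustrative only and not taken from the paper; the function name `hilbert_d2xy` is our own.

```python
def hilbert_d2xy(order, d):
    """Map index d along a Hilbert curve filling a 2**order x 2**order
    grid to its (x, y) cell coordinates (standard bit-twiddling form)."""
    x = y = 0
    t = d
    s = 1
    while s < (1 << order):
        rx = 1 & (t // 2)          # which half of the current quadrant
        ry = 1 & (t ^ rx)
        if ry == 0:                # rotate/reflect the sub-square
            if rx == 1:
                x = s - 1 - x
                y = s - 1 - y
            x, y = y, x            # swap coordinates
        x += s * rx                # move into the correct quadrant
        y += s * ry
        t //= 4
        s *= 2
    return x, y

# Scanning a data array in Peano-Hilbert order then amounts to visiting
# cells at hilbert_d2xy(order, d) for d = 0, 1, ..., 4**order - 1.
```

Consecutive indices map to adjacent cells, which is the locality property that makes this scan attractive for scandiction of individual data arrays.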