Many modern data sources, arising for instance from social networks or biological networks, are best viewed as indexed by combinatorial structures such as graphs, rather than as time series. The local weak limit theory for sparse graphs, also known as the objective method, due to Benjamini, Schramm, Aldous, Steele, Lyons and others, provides a framework in which to make sense of what one might mean by a stationary process indexed by graphs. The theory of time series is the engine driving an enormous range of applications in areas such as control theory, communications, information theory and signal processing, to name just a few. It is to be expected that a theory of stationary stochastic processes indexed by combinatorial structures, in particular graphs, would eventually have a similarly wide-ranging impact. This talk will describe two recent pieces of work by the authors that flesh out this perspective. The first is a theory of universal lossless data compression of stationary graph-indexed stochastic processes. The second is a methodology for understanding load balancing in large resource allocation problems that can be described by bipartite graphs (or, equivalently, hypergraphs).

Consider a graph whose edges carry marks, viewed as data samples associated to the edges. For example, the vertices might be individuals and an edge mark might describe the degree of affinity between the corresponding pair of individuals. We pose the problem of representing the information content of such a marked graph in the most efficient way possible. Further, we would like to do this in a universal sense, i.e., in effect, without making any statistical assumptions about the data sample being compressed. It turns out that one can make precise sense of this question in the language of the local weak limit theory, the only assumption being that the graph is sparse, i.e., that the number of edges is on the same scale as the number of vertices, with a given mean degree and a tree-like limit. What is more, the compression can be done in the most efficient way possible: one can make sense of a notion of entropy associated to the local weak limit, which lives on a linear scale in the number of vertices, and one can compress down to this entropy in a universal sense. The entropy notion we work with is a generalized version of the one introduced by Bordenave and Caputo (2014). We also illustrate how, with the BC entropy, one can naturally find counterparts of concepts from classical information theory, e.g., the notions of joint entropy, conditional entropy, and mutual information, as well as a calculus of entropy. As our universal compression work shows, one can associate operational meanings to the BC entropy in contexts such as distributed source coding and source coding with side information at the decoder.

As another example of the use of the local weak limit theory, we discuss the problem of load balancing on hypergraphs, where each hyperedge carries load to be distributed among its vertices. An allocation is said to be balanced if the total loads at the vertices are equalized to the extent possible. Hajek analyzed this problem for finite hypergraphs and showed that, while a balanced allocation need not be unique, all balanced allocations result in the same total load at each vertex. We show how to characterize the distribution of the balanced load at a typical vertex in a sequence of increasingly large hypergraphs through a recursive distributional fixed-point equation arising from the local weak limit theory.
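To fix ideas, here is a minimal sketch of the notion of a balanced allocation, with notation chosen here for illustration rather than taken from the talk. Suppose each hyperedge $e$ of a finite hypergraph carries a load $\lambda(e) \ge 0$. An allocation assigns amounts $x(e,v) \ge 0$ to the vertices $v \in e$, with $\sum_{v \in e} x(e,v) = \lambda(e)$, and the resulting load at a vertex is $\partial x(v) = \sum_{e \ni v} x(e,v)$. One standard way to express that the vertex loads are equalized to the extent possible is the condition
$$ x(e,u) > 0 \;\Longrightarrow\; \partial x(u) \le \partial x(v) \qquad \text{for every hyperedge } e \text{ and all } u, v \in e, $$
i.e., no hyperedge places positive load on a vertex that is strictly more loaded than another of its own vertices, so that no local transfer of load can improve the balance.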
In effect, the unique balanced load distributions on the finite hypergraphs approaching the limit can be viewed as analogs of periodic time series that then converge to an analog of a stationary time series in the limit.
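Returning to the compression part of the talk, the sense in which the entropy lives on a linear scale in the number of vertices can be sketched as follows in the unmarked case, again with notation chosen here for illustration; the marked version we actually work with is a generalization. For a limit $\mu$ with average degree $d$, and a sequence of edge counts $m_n$ with $m_n/n \to d/2$, the Bordenave--Caputo entropy is, roughly,
$$ \Sigma(\mu) \;=\; \lim_{\varepsilon \downarrow 0}\ \limsup_{n \to \infty}\ \frac{\log\big|\mathcal{G}_{n,m_n}(\mu,\varepsilon)\big| \;-\; m_n \log n}{n}, $$
where $\mathcal{G}_{n,m_n}(\mu,\varepsilon)$ denotes the set of graphs on $n$ vertices with $m_n$ edges whose neighborhood distribution at a uniformly chosen root is within $\varepsilon$ of $\mu$. The number of such typical graphs thus grows like $\exp\!\big(m_n \log n + n\,\Sigma(\mu) + o(n)\big)$, and it is this linear-in-$n$ term that the universal compression down to entropy refers to.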