Generation of random variables with prescribed distributions from coin flips arises in applications such as Monte Carlo simulation, randomized algorithms, and cryptography. In a seminal paper, Knuth and Yao showed that the entropy of the target distribution characterizes the expected number of fair coin flips needed for "local" randomness generation. In this talk, we consider three "remote" one-shot generation settings, namely distributed generation, channel simulation with and without common randomness, and universal remote generation. We show that the amount of randomness needed in each case can again be bounded using information-theoretic quantities. The proofs of these results involve several new techniques, including a dyadic decomposition generation scheme, a new quantity called erosion entropy, and a strong functional representation lemma.
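
For concreteness, the following is a minimal Python sketch of the "local" setting: sampling from a prescribed distribution using only fair coin flips, via dyadic interval refinement in the spirit of the Knuth-Yao result (it is an illustration, not the talk's scheme; the function name sample_from_coin_flips and the use of random.getrandbits as the coin are assumptions for this sketch). Each flip halves a dyadic interval, and an outcome is emitted once the interval fits inside one cell of the cumulative distribution; for dyadic distributions the expected number of flips equals the entropy, and in general it is within a small additive constant of it.

    import random

    def sample_from_coin_flips(p, flip=lambda: random.getrandbits(1)):
        # Sample index i with probability p[i] using only fair coin flips.
        # Maintain a dyadic interval [low, low + width) that halves with
        # each flip; output i once it fits inside the i-th CDF cell.
        # (Floating-point sketch; terminates with probability 1.)
        cdf = [0.0]
        for q in p:
            cdf.append(cdf[-1] + q)
        low, width = 0.0, 1.0
        while True:
            width /= 2.0
            if flip():                  # one fair coin flip per iteration
                low += width
            for i in range(len(p)):     # check containment in a CDF cell
                if cdf[i] <= low and low + width <= cdf[i + 1]:
                    return i

    # Example: the dyadic distribution (1/2, 1/4, 1/4), whose entropy is
    # H = 1.5 bits; the expected number of flips here is exactly 1.5.
    counts = [0, 0, 0]
    for _ in range(10_000):
        counts[sample_from_coin_flips([0.5, 0.25, 0.25])] += 1
    print(counts)  # roughly proportional to [5000, 2500, 2500]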