I will consider the problem of generating an i.i.d. string at the output of a discrete memoryless channel using a limited amount of randomness at its input, known as "channel resolvability." I will first discuss the exponential decay rate, in the code block-length, of the informational divergence between the output distribution of the channel, when a resolvability code is used at its input, and the product (i.i.d.) measure; this rate is called the resolvability exponent. I will present the ensemble-optimal resolvability exponents attained when a randomly constructed resolvability code is employed. Next, I will turn to the problem of channel resolvability in the presence of causal noiseless feedback. Feedback allows the encoder to adapt the entropy rate it consumes to the channel's behavior via variable-length resolvability schemes, but it does not reduce the minimum average entropy rate required to achieve an accurate approximation of an i.i.d. output string. However, rate adaptation may allow one to achieve higher resolvability exponents. I will present two examples, the binary symmetric channel and the binary erasure channel, for which the achievable resolvability exponents can be substantially improved by using a variable-length resolvability code in the presence of feedback.
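As a rough sketch of the quantity discussed above (the notation here is my own, not taken from the talk): if a resolvability code of block-length n induces the output distribution P_{Y^n}, and Q_Y denotes the target output distribution, the resolvability exponent can be expressed as the exponential decay rate of the divergence to the product measure:

```latex
E \;=\; \liminf_{n \to \infty} \, -\frac{1}{n} \log D\!\left( P_{Y^n} \,\middle\|\, Q_Y^{\otimes n} \right),
```

where $D(\cdot\|\cdot)$ is the informational (Kullback–Leibler) divergence and $Q_Y^{\otimes n}$ is the i.i.d. product measure on output strings of length $n$.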