The popularity of the Gibbs sampler stems from its simplicity and its power to sample from high-dimensional probability distributions. It can sometimes, however, be slow to converge, especially with highly structured, complex models. The recently proposed Partially Collapsed Gibbs (PCG) sampler (van Dyk and Park, 2008, {\it JASA}; Park and van Dyk, 2009, {\it JCGS}) reduces the conditioning in the component draws of a Gibbs sampler to significantly improve convergence. PCG must be implemented with care, however, because its conditional distributions may be functionally incompatible, and permuting the order of the draws can change the stationary distribution of the chain. As in an ordinary Gibbs sampler, we sometimes find that one or more of the conditional draws of a PCG sampler are not available in closed form, and we may consider implementing such draws with the help of the Metropolis-Hastings sampler. Doing so, however, requires care, as it may alter the stationary distribution of the chain. This can happen even when the PCG sampler would work perfectly well if all the conditional updates were available without resorting to Metropolis updates. In this talk we illustrate the difficulties that may arise when using Metropolis-Hastings updates within a PCG sampler and develop a general strategy for using such updates while maintaining the target stationary distribution. Our proposed computational methods are illustrated with an example from High-Energy Astrophysics that involves sampling from the complex distributions that arise when uncertainty in instrument calibration is quantified via a Bayesian posterior distribution.
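
To make the ideas concrete, the following is a minimal illustrative sketch in Python (not code from the talk or the cited papers), assuming a toy conjugate model mu ~ N(0,1), theta | mu ~ N(mu,1), y | theta ~ N(theta,1) with y observed; the variable names and functions are hypothetical. The PCG step draws mu from the reduced conditional p(mu | y), with theta marginalized out; the second sampler pretends that draw is intractable and replaces it with a Metropolis-Hastings update whose invariant distribution is that same marginalized conditional, which in this simple setting is one way to preserve the target stationary distribution.

\begin{verbatim}
# Minimal sketch (illustrative only) of a PCG step and an MH substitute,
# using a toy conjugate model so every distribution is known exactly:
#     mu ~ N(0, 1),  theta | mu ~ N(mu, 1),  y | theta ~ N(theta, 1).
import numpy as np

rng = np.random.default_rng(0)
y = 1.5          # observed datum (arbitrary value for illustration)
n_iter = 5000

def pcg_exact(n_iter):
    """PCG sampler with both draws available in closed form."""
    draws = np.empty((n_iter, 2))
    for t in range(n_iter):
        # Step 1 (reduced conditioning): mu ~ p(mu | y), theta integrated out.
        mu = rng.normal(y / 3.0, np.sqrt(2.0 / 3.0))
        # Step 2: theta ~ p(theta | mu, y).
        theta = rng.normal((mu + y) / 2.0, np.sqrt(0.5))
        draws[t] = mu, theta
    return draws

def log_p_mu_given_y(mu):
    """Log of the marginalized conditional p(mu | y), up to a constant:
    p(y | mu) p(mu) with theta integrated out, so y | mu ~ N(mu, 2)."""
    return -0.25 * (y - mu) ** 2 - 0.5 * mu ** 2

def pcg_with_mh(n_iter, step=1.0):
    """Same PCG sampler, but pretending p(mu | y) is intractable and
    replacing Step 1 with a random-walk MH update.  The acceptance
    ratio targets the marginalized p(mu | y), not a full conditional
    that also conditions on theta, so the reduced conditioning of the
    exact PCG step is respected."""
    draws = np.empty((n_iter, 2))
    mu = 0.0
    for t in range(n_iter):
        prop = mu + step * rng.normal()
        if np.log(rng.uniform()) < log_p_mu_given_y(prop) - log_p_mu_given_y(mu):
            mu = prop
        theta = rng.normal((mu + y) / 2.0, np.sqrt(0.5))
        draws[t] = mu, theta
    return draws

exact, mh = pcg_exact(n_iter), pcg_with_mh(n_iter)
print("posterior mean of mu:", exact[:, 0].mean(), "vs", mh[:, 0].mean())
\end{verbatim}

The point encoded in the sketch is that the Metropolis-Hastings acceptance ratio is computed from the same marginalized conditional that the exact PCG step would sample, rather than from a distribution that conditions on components the PCG derivation integrated out; the general strategy for more complex samplers is the subject of the talk.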