We study data processing inequalities (DPIs) derived from a certain class of
generalized information measures (IMs), in which a series of convex functions
and multiplicative likelihood ratios are nested alternately. One particular
choice of the convex functions leads to an IM that extends the notion of the
Bhattacharyya distance (BD): while the ordinary BD is based on the geometric
mean of two replicas of the channel's conditional distribution, the more
general IM allows
an arbitrary number of replicas. We apply the DPI induced by this IM to obtain
lower bounds on signal parameter estimation and show that tighter bounds are
obtained by using more than two replicas. The advantage of this bound over
existing bounds becomes apparent when there is channel uncertainty, such as
unknown fading.
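
To make the replica idea concrete, here is a sketch in our own notation (not necessarily the paper's): the ordinary BD between the output distributions induced by two parameter values is a geometric mean of two replicas, and a natural equal-weight extension takes a k-th order geometric mean over k replicas.

```latex
% Ordinary Bhattacharyya distance between the channel output
% distributions induced by two parameter values \theta_1, \theta_2:
B(\theta_1,\theta_2) = -\ln \sum_{y} \sqrt{P(y\mid\theta_1)\,P(y\mid\theta_2)}.
% A natural k-replica extension (equal weights assumed; notation ours):
B_k(\theta_1,\dots,\theta_k)
  = -\ln \sum_{y} \prod_{i=1}^{k} \bigl[P(y\mid\theta_i)\bigr]^{1/k},
% which reduces to the ordinary BD for k = 2.
```

The exact weighting and nesting used in the paper may differ; this is only meant to convey how "an arbitrary number of replicas" generalizes the two-replica geometric mean.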