Conditional Entropy

Conditional Entropy represents the amount of uncertainty remaining about the channel input after the channel output has been observed (unlike Entropy, which measures the information in the source). It is defined mathematically by:

$$ H(A_X|A_Y) = \sum^{K-1}_{k=0} \sum^{J-1}_{j=0} p(x_j|y_k) p(y_k) \lg \frac{1}{p(x_j|y_k)} $$
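
As a minimal sketch of how this could be computed numerically (the function name and the binary-symmetric-channel example are illustrative, not from the source), assuming the joint distribution is given as a 2-D array with `p_xy[j, k] = p(x_j, y_k)`:

```python
import numpy as np

def conditional_entropy(p_xy):
    """H(A_X|A_Y) in bits, given the joint distribution p_xy[j, k] = p(x_j, y_k)."""
    p_y = p_xy.sum(axis=0)  # marginal p(y_k), summed over the input alphabet
    h = 0.0
    for j in range(p_xy.shape[0]):
        for k in range(p_xy.shape[1]):
            if p_xy[j, k] > 0:  # skip zero-probability terms (0 * lg(1/0) -> 0)
                # p(x_j|y_k) = p(x_j, y_k) / p(y_k), so p(x_j|y_k) p(y_k) = p(x_j, y_k)
                p_x_given_y = p_xy[j, k] / p_y[k]
                h += p_xy[j, k] * np.log2(1.0 / p_x_given_y)
    return h

# Hypothetical example: binary symmetric channel, uniform input, crossover 0.1
p_xy = np.array([[0.45, 0.05],
                 [0.05, 0.45]])
print(conditional_entropy(p_xy))  # ~0.469 bits
```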

Note: Notice the relationship to the Joint Probability Distribution: the product $p(x_j|y_k)\,p(y_k)$ in the definition is exactly the joint probability $p(x_j, y_k)$.
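
Substituting this identity into the definition gives the equivalent form as a sum over the joint distribution:

$$ H(A_X|A_Y) = \sum^{K-1}_{k=0} \sum^{J-1}_{j=0} p(x_j, y_k) \lg \frac{1}{p(x_j|y_k)} $$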

#math