Mutual information
Suppose we have two random variables $X$ and $Y$ over domains $\mathcal{X}$ and $\mathcal{Y}$. Then the mutual information is defined to be
$$I(X; Y) = H(X) - H(X \mid Y),$$
where $H(X)$ is the information entropy of $X$ and $H(X \mid Y)$ is the conditional entropy.
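As a concrete illustration, here is a minimal Python sketch (not from the original text) that computes $I(X; Y) = H(X) - H(X \mid Y)$ from an assumed joint probability table, where `joint[i, j]` is taken to be $P(X = x_i, Y = y_j)$; the function name and the example distribution are hypothetical.

```python
import numpy as np

def mutual_information(joint):
    """Compute I(X; Y) = H(X) - H(X | Y) from a joint probability table.

    joint[i, j] is assumed to be P(X = x_i, Y = y_j):
    rows index values of X, columns index values of Y.
    """
    joint = np.asarray(joint, dtype=float)
    p_x = joint.sum(axis=1)  # marginal P(X)
    p_y = joint.sum(axis=0)  # marginal P(Y)

    # H(X) = -sum_x P(x) log2 P(x)
    h_x = -np.sum(p_x[p_x > 0] * np.log2(p_x[p_x > 0]))

    # H(X | Y) = -sum_{x,y} P(x, y) log2 P(x | y), with P(x | y) = P(x, y) / P(y)
    h_x_given_y = 0.0
    for i in range(joint.shape[0]):
        for j in range(joint.shape[1]):
            if joint[i, j] > 0:
                h_x_given_y -= joint[i, j] * np.log2(joint[i, j] / p_y[j])

    return h_x - h_x_given_y

# Example: X and Y are perfectly correlated fair bits, so I(X; Y) = 1 bit.
print(mutual_information([[0.5, 0.0], [0.0, 0.5]]))
```

For the example above, $H(X) = 1$ bit and $H(X \mid Y) = 0$, so the function returns 1 bit, matching the definition.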