# Contents

### Questions

Q1: Which is a better measure to report - KL Divergence or Mutual Information?

Q2: Is it true that the mutual information of a variable with itself is 1?

### Answers

Mutual Information is computed from the joint distribution of the two linked nodes only, whereas the Arc Force is based on the KL Divergence, which also takes into account the other parents of the child node. Therefore, Mutual Information (I) and KL Divergence are identical when there are no spouses (co-parents) involved in the measured relation.

**Example**

Let's take the following network with two nodes *X* and *Z*.

Analyzing the relation with Mutual Information (**Validation Mode: Analysis | Visual | Arcs' Mutual Information**) and with KL Divergence (**Validation Mode: Analysis | Visual | Arc Force**) returns the same value: 0.3436
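For a two-node network, removing the arc makes *X* and *Z* independent while keeping their marginals, so the Arc Force KL(P(X, Z) || P(X)P(Z)) is exactly the definition of I(X; Z). Below is a minimal Python sketch of this identity; the joint probabilities are illustrative assumptions (the example's CPTs are not reproduced here), so the resulting value differs from 0.3436.

```python
import numpy as np

# Hypothetical joint distribution P(X, Z) for a two-node network X -> Z.
# These probabilities are illustrative assumptions, not the CPTs of the
# example above.
p_xz = np.array([[0.40, 0.10],
                 [0.05, 0.45]])

p_x = p_xz.sum(axis=1)  # marginal P(X)
p_z = p_xz.sum(axis=0)  # marginal P(Z)

# With the arc removed, the joint becomes the product of the marginals,
# so the Arc Force KL(P(X,Z) || P(X)P(Z)) equals I(X; Z), in bits.
product = np.outer(p_x, p_z)
mi = np.sum(p_xz * np.log2(p_xz / product))
print(f"I(X; Z) = {mi:.4f} bits")
```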

The percentage displayed in blue in the Arc Force analysis is the relative weight of the link, i.e., its force divided by the sum of all the arc forces.

However, as soon as other variables are involved in the relation as co-parents, the KL Divergence will integrate them in the analysis, leading to a more precise result.

**Example**

Let's take the following deterministic example where *Z* is an Exclusive Or between *X* and *Y*, i.e. true when *X* and *Y* are different.

The analysis of the relations with Mutual Information (**Validation Mode: Analysis | Visual | Arcs' Mutual Information**) returns the following graph, where the mutual information values between *X* and *Z* and between *Y* and *Z* are both null.

Indeed, *X* and *Y* do not have any impact on *Z* when they are analyzed separately.

On the other hand, the force of the arcs computed with KL (**Validation Mode: Analysis | Visual | Arc Force**) perfectly reflects the deterministic relation of *X* and *Y* on *Z*.
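To make this concrete, here is a minimal Python sketch of the XOR network, assuming uniform, independent priors on *X* and *Y* (the example's priors are not shown): each pairwise mutual information with *Z* is null, while *X* and *Y* taken jointly fully determine *Z*.

```python
import itertools
import numpy as np

# Deterministic XOR: Z = 1 exactly when X != Y. We assume uniform,
# independent binary priors on X and Y (not specified in the example).
p_xyz = {}
for x, y in itertools.product([0, 1], repeat=2):
    z = int(x != y)
    p_xyz[(x, y, z)] = 0.25  # P(X=x) * P(Y=y) = 0.5 * 0.5

def mi(pairs):
    """Mutual information (in bits) from a joint distribution {(a, b): p}."""
    pa, pb = {}, {}
    for (a, b), p in pairs.items():
        pa[a] = pa.get(a, 0.0) + p
        pb[b] = pb.get(b, 0.0) + p
    return sum(p * np.log2(p / (pa[a] * pb[b]))
               for (a, b), p in pairs.items() if p > 0)

# Pairwise joints: marginalize out the third variable.
p_xz, p_yz, p_xy_z = {}, {}, {}
for (x, y, z), p in p_xyz.items():
    p_xz[(x, z)] = p_xz.get((x, z), 0.0) + p
    p_yz[(y, z)] = p_yz.get((y, z), 0.0) + p
    p_xy_z[((x, y), z)] = p_xy_z.get(((x, y), z), 0.0) + p

print(mi(p_xz))    # 0.0 -> X alone tells nothing about Z
print(mi(p_yz))    # 0.0 -> Y alone tells nothing about Z
print(mi(p_xy_z))  # 1.0 -> jointly, X and Y fully determine Z
```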

Two clones will have a Normalized Mutual Information *I_N(X, X)* = 1, but not necessarily a Mutual Information *I(X, X)* = 1. Since *I(X, X)* = *H(X)*, it depends on the value of the initial entropy *H(X)*. You will get *I(X, X)* = 1 with a binary variable *X* that has a uniform marginal distribution.
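A minimal Python sketch of this identity, using *I(X, X)* = *H(X)*: only the uniform binary case reaches 1 bit, while the Normalized Mutual Information is always 1 for clones.

```python
import numpy as np

def entropy(p):
    """Shannon entropy in bits."""
    p = np.asarray(p, dtype=float)
    nz = p[p > 0]
    return -np.sum(nz * np.log2(nz))

# For a clone X' of X, the joint P(X, X') is diagonal, hence
# I(X; X') = H(X) and I_N(X, X') = I(X; X') / H(X) = 1.
for p in ([0.5, 0.5],    # uniform binary:  H(X) = 1 bit -> I(X, X) = 1
          [0.8, 0.2],    # skewed binary:   H(X) < 1 bit
          [0.25] * 4):   # uniform 4-state: H(X) = 2 bits
    print(f"P(X) = {p}: I(X, X) = H(X) = {entropy(p):.4f} bits, I_N = 1")
```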