
# New Metric: Normalized Entropy

*Entropy*, $H(X)$, is a key metric in BayesiaLab for measuring the uncertainty associated with the probability distribution of a variable **X**. The entropy of a variable can also be understood as the sum of the **Expected Log-Losses** of its states. It is expressed in bits and defined as follows:

$$H(X) = -\sum_{x \in X} p(x) \log_2 p(x)$$

**Example**

Let's consider a binary variable that indicates the color of a ball, which can be either red or blue.

The graph below illustrates the values taken by the entropy $H(X)$ as a function of the probability of the ball being red. The entropy reaches its maximum when the distribution is uniform, i.e., when the probability of red equals the probability of blue. Indeed, this corresponds to the situation where choosing a color based on this distribution is purely random, and thus most uncertain.
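To make this concrete, here is a minimal Python sketch of the definition above (the `entropy` helper is ours, not a BayesiaLab API), evaluated for several probabilities of red:

```python
import math

def entropy(dist):
    """Shannon entropy in bits: H(X) = -sum p(x) * log2 p(x)."""
    return -sum(p * math.log2(p) for p in dist if p > 0.0)

# Entropy of the ball's color as a function of P(red).
for p_red in (0.0, 0.1, 0.5, 0.9, 1.0):
    h = entropy([p_red, 1.0 - p_red])
    print(f"P(red) = {p_red:.1f}  ->  H(X) = {h:.3f} bits")
# The maximum (1 bit) is reached at the uniform distribution P(red) = 0.5.
```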

This maximum value depends only on the number of states:

$$H_{max}(X) = \log_2(S_X)$$

where $S_X$ is the number of states of the variable **X**.

Thus, the entropies of variables that do not have the same number of states cannot be directly compared. This is why version 7.0 now features the *Normalized Entropy*, a metric that takes into account the maximum value of the entropy and returns a normalized measure of the uncertainty associated with the variable:

$$H_N(X) = \frac{H(X)}{\log_2(S_X)}$$

**Example**

Here are two variables with different numbers of states and different *Entropies*, but identical *Normalized Entropies*, as shown in the sketch below:
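A minimal sketch of that effect, using illustrative uniform distributions of our own choosing rather than the original example's values:

```python
import math

def entropy(dist):
    """Shannon entropy in bits."""
    return -sum(p * math.log2(p) for p in dist if p > 0.0)

def normalized_entropy(dist):
    """H(X) divided by its maximum value log2(S_X), yielding a value in [0, 1]."""
    return entropy(dist) / math.log2(len(dist))

two_states  = [0.5, 0.5]                # H = 1 bit,  max = log2(2) = 1
four_states = [0.25, 0.25, 0.25, 0.25]  # H = 2 bits, max = log2(4) = 2

for dist in (two_states, four_states):
    print(f"{len(dist)} states: H = {entropy(dist):.2f} bits, "
          f"normalized H = {normalized_entropy(dist):.2f}")
# Different entropies (1 vs. 2 bits) but identical Normalized Entropies (1.0).
```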

The *Normalized Entropy* is available in the tooltip associated with the monitors in **Information Mode**, while hovering over the monitor:

It is also possible to sort the monitors by *Normalized Entropy* via the **Monitor Panel Contextual Menu**.

# New Metric: Normalized Mutual Information

The **Conditional Entropy** $H(X|Y)$ measures, in bits, the **Expected Log-Loss** associated with variable **X** once we have information on variable **Y**:

$$H(X|Y) = -\sum_{y \in Y} p(y) \sum_{x \in X} p(x|y) \log_2 p(x|y)$$

This conditional entropy is key for defining the mutual information between **X** and **Y**.

The **Mutual Information** $I(X, Y)$ measures the amount of information gained on variable **X** (i.e., the reduction in the expected Log-Loss) by observing variable **Y**:

$$I(X, Y) = H(X) - H(X|Y)$$
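Here is a minimal Python sketch of both formulas (the joint distribution and helper names are illustrative, not BayesiaLab code), verifying that $I(X, Y) = H(X) - H(X|Y)$:

```python
import math

def entropy(dist):
    """Shannon entropy in bits."""
    return -sum(p * math.log2(p) for p in dist if p > 0.0)

# Illustrative joint distribution p(x, y) for two binary variables (values are ours).
joint = {("x0", "y0"): 0.4, ("x0", "y1"): 0.1,
         ("x1", "y0"): 0.1, ("x1", "y1"): 0.4}

# Marginal distributions p(x) and p(y).
p_x, p_y = {}, {}
for (x, y), p in joint.items():
    p_x[x] = p_x.get(x, 0.0) + p
    p_y[y] = p_y.get(y, 0.0) + p

h_x = entropy(p_x.values())

# H(X|Y) = -sum_y p(y) * sum_x p(x|y) * log2 p(x|y)
h_x_given_y = sum(py * entropy([joint[(x, y)] / py for x in p_x])
                  for y, py in p_y.items())

# I(X, Y) = H(X) - H(X|Y): the reduction in expected Log-Loss on X given Y.
mi = h_x - h_x_given_y
print(f"H(X) = {h_x:.3f}, H(X|Y) = {h_x_given_y:.3f}, I(X, Y) = {mi:.3f} bits")
```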

**Y**The **Venn Diagram** below illustrates this concept:

Version 7.0 now features a normalized version of the *Mutual Information* that takes into account the maximum value that the entropy of **X** can take:

$$I_N(X, Y) = \frac{I(X, Y)}{\log_2(S_X)}$$
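A minimal sketch of this normalization, reusing the mutual information computed in the previous sketch (the function name is ours):

```python
import math

def normalized_mutual_information(mi, n_states_x):
    """I(X, Y) divided by the maximum entropy of X, log2(S_X)."""
    return mi / math.log2(n_states_x)

# With the joint distribution above: I(X, Y) ~= 0.278 bits and S_X = 2,
# so the maximum entropy of X is log2(2) = 1 bit.
print(f"{normalized_mutual_information(0.278, 2):.3f}")  # -> 0.278
```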

**X**The * Normalized Mutual Information* is available in the

**Target Analysis Report (Analysis | Report | Target | Relationship with Target Node)**

It is also available in the **Arc's Mutual Information (Analysis | Visual | Overall | Arc | Mutual Information)**.

# New Metric: Symmetric Normalized Mutual Information

As we can see on the **Venn Diagram**, the mutual information $I(X, Y)$ is identical for the two variables **X** and **Y**. However, these variables can have a different number of states.

Thus, the *Symmetric Normalized Mutual Information* takes into account the maximum values that the entropies of **X** and **Y** can take:
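A plausible normalization, assumed here by analogy with the Pearson-style form described for the relative variant below, divides $I(X, Y)$ by the geometric mean of the two maximum entropies; the sketch below illustrates that assumption rather than a confirmed BayesiaLab formula:

```python
import math

def symmetric_normalized_mi(mi, n_states_x, n_states_y):
    # ASSUMPTION: geometric mean of the maximum entropies log2(S_X) and log2(S_Y),
    # mirroring the Pearson-style normalization of the relative variant.
    return mi / math.sqrt(math.log2(n_states_x) * math.log2(n_states_y))

# Example: I(X, Y) = 0.278 bits, X binary (S_X = 2), Y with four states (S_Y = 4).
print(f"{symmetric_normalized_mi(0.278, 2, 4):.3f}")  # 0.278 / sqrt(1 * 2) ~= 0.197
```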

*Y*The **Symmetric*** Normalized Mutual Information* is available in the

**Relationship Analysis Report (Analysis | Report | Relationship)**

It is also available in the **Arc's Mutual Information (Analysis | Visual | Overall | Arc | Mutual Information)**.

# Updated Metric: Relative Mutual Information

This metric measures the percentage of information gained on **X** by observing **Y**:

$$I_R(X, Y) = \frac{I(X, Y)}{H(X)}$$

This metric was named *Normalized Mutual Information* in previous versions of BayesiaLab.
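A minimal sketch of this ratio, reusing the values from the joint-distribution example above (the function name is ours):

```python
import math

def entropy(dist):
    """Shannon entropy in bits."""
    return -sum(p * math.log2(p) for p in dist if p > 0.0)

def relative_mutual_information(mi, dist_x):
    """Fraction of the uncertainty on X removed by observing Y: I(X, Y) / H(X)."""
    return mi / entropy(dist_x)

# From the earlier sketch: H(X) = 1 bit and I(X, Y) ~= 0.278 bits.
print(f"{relative_mutual_information(0.278, [0.5, 0.5]):.1%}")  # -> 27.8%
```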

The *Relative Mutual Information* is available in the **Target Analysis Report (Analysis | Report | Target | Relationship with Target Node)**.

It is also available in the **Arc's Mutual Information (Analysis | Visual | Overall | Arc | Mutual Information)**.

# Updated Metric: Symmetric Relative Mutual Information

Thus, the *Symmetric Relative Mutual Information* takes into account the entropies of **X** and **Y**:

$$I_{SR}(X, Y) = \frac{I(X, Y)}{\sqrt{H(X) \, H(Y)}}$$

This normalization is calculated akin to the *Pearson Correlation Coefficient*: the mutual information is the analogue of the covariance, and the entropies are analogous to the variances.
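Following that analogy, a minimal sketch (the geometric-mean form is inferred from the covariance/variance analogy above; the function name is ours):

```python
import math

def symmetric_relative_mi(mi, h_x, h_y):
    # Pearson-style normalization: mutual information plays the role of covariance,
    # while the entropies H(X) and H(Y) play the role of the two variances.
    return mi / math.sqrt(h_x * h_y)

# With the values from the earlier sketch: H(X) = H(Y) = 1 bit, I(X, Y) ~= 0.278.
print(f"{symmetric_relative_mi(0.278, 1.0, 1.0):.3f}")  # -> 0.278
```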

The **Symmetric** *Relative Mutual Information* is available in the **Relationship Analysis Report (Analysis | Report | Relationship)**.

It is also available in the **Arc's Mutual Information (Analysis | Visual | Overall | Arc | Mutual Information)**.

A closely related metric was named **Symmetric** *Normalized Mutual Information* in previous versions of BayesiaLab, with a slightly different normalization factor.