# New Metric: Normalized Entropy

Entropy, H(X), is a key metric in BayesiaLab for measuring the uncertainty associated with the probability distribution of a variable X. The entropy of a variable can also be understood as the sum of the Expected Log-Losses of its states.

It is expressed in bits and defined as follows:
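In standard Shannon form, summing over the states x of X:

$$H(X) = -\sum_{x} p(x)\,\log_2 p(x)$$

Each term $-p(x)\log_2 p(x)$ is the Expected Log-Loss of state $x$, consistent with the interpretation above.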

Example

Let's consider a binary variable that indicates the color of a ball, which can be either red or blue.

The graph below illustrates the values taken by the entropy H(X) as a function of the probability that the ball is red. The entropy reaches its maximum when the distribution is uniform, i.e., when the probability of red equals the probability of blue. This is indeed the situation in which choosing a color based on this distribution is purely random, and thus the most uncertain.
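The shape of this curve is easy to verify numerically. The following sketch (plain Python, not BayesiaLab code) computes H(X) for a few values of P(red):

```python
import math

def entropy(probs):
    """Shannon entropy in bits of a discrete distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Entropy of the red/blue ball as a function of P(red).
# H peaks at 1 bit for the uniform distribution P(red) = 0.5.
for p_red in [0.0, 0.1, 0.25, 0.5, 0.75, 1.0]:
    h = entropy([p_red, 1 - p_red])
    print(f"P(red) = {p_red:.2f}  ->  H = {h:.3f} bits")
```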

The maximum value of the entropy increases logarithmically with the number of states of the variable:
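$$H_{\max}(X) = \log_2 S_X$$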

where S_X is the number of states of the variable X.

Thus, the entropies of variables that do not have the same number of states cannot be directly compared. This is why version 7.0 now features the Normalized Entropy, a metric that takes into account the maximum value of the entropy and returns a normalized measure of the uncertainty associated with the variable:
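In standard form, the Normalized Entropy divides the entropy by its maximum value:

$$\text{Normalized Entropy}(X) = \frac{H(X)}{\log_2 S_X}$$

This yields a value between 0 and 1 regardless of the number of states, which makes variables directly comparable.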

Example

Here are two variables with different numbers of states and different Entropies, but identical Normalized Entropies:
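The effect can be reproduced with a small sketch. The two distributions below are illustrative (not the ones from the screenshot): a uniform binary variable and a uniform four-state variable have different Entropies (1 and 2 bits) but the same Normalized Entropy:

```python
import math

def entropy(probs):
    """Shannon entropy in bits of a discrete distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def normalized_entropy(probs):
    """Entropy divided by its maximum, log2 of the state count."""
    return entropy(probs) / math.log2(len(probs))

binary = [0.5, 0.5]        # 2 states: H = 1 bit
four_state = [0.25] * 4    # 4 states: H = 2 bits

print(entropy(binary), normalized_entropy(binary))          # 1.0 1.0
print(entropy(four_state), normalized_entropy(four_state))  # 2.0 1.0
```

Both variables are maximally uncertain given their state counts, so both Normalized Entropies equal 1.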

The Normalized Entropy is available in the tooltip associated with the monitors, in Information Mode, while hovering over the monitor:

It is also possible to sort the monitors by Normalized Entropy via the Monitor Panel Contextual Menu.

The Normalized Entropy is also available as a new Node Analysis metric (Size and Color) in the 2D and 3D Mapping Tools.

# New Metric: Normalized Mutual Information

The Conditional Entropy H(X|Y) measures, in bits, the Expected Log-Loss associated with variable X once we have information on variable Y:
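In standard form:

$$H(X|Y) = -\sum_{y} p(y) \sum_{x} p(x|y)\,\log_2 p(x|y)$$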

This Conditional Entropy is thus key to defining the Mutual Information between X and Y.

The Mutual Information I(X, Y) measures the amount of information gained on variable X (the reduction in the expected Log-Loss) by observing variable Y:
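In standard form:

$$I(X, Y) = H(X) - H(X|Y)$$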

The Venn Diagram below illustrates this concept:

Version 7.0 now features a normalized version of the Mutual Information that takes into account the maximum value that the entropy of X can take:
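Given the description above, the normalizer is the maximum entropy of X:

$$\text{Normalized MI}(X, Y) = \frac{I(X, Y)}{\log_2 S_X}$$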

The Normalized Mutual Information is available in the Target Analysis Report (Analysis | Report | Target | Relationship with Target Node)

It is also available in the Arc's Mutual Information (Analysis | Visual | Overall | Arc | Mutual Information)
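How these quantities relate can be sketched on an illustrative joint distribution (the numbers below are made up for the example; this is plain Python, not BayesiaLab code):

```python
import math

# Illustrative joint distribution p(x, y); rows are states of X,
# columns are states of Y.
joint = [[0.3, 0.2],
         [0.1, 0.4]]

px = [sum(row) for row in joint]        # marginal of X
py = [sum(col) for col in zip(*joint)]  # marginal of Y

# Mutual Information I(X, Y) in bits
mi = sum(pxy * math.log2(pxy / (px[i] * py[j]))
         for i, row in enumerate(joint)
         for j, pxy in enumerate(row)
         if pxy > 0)

# Normalized Mutual Information: divide by the maximum entropy
# of X, i.e., log2 of the number of states of X.
normalized_mi = mi / math.log2(len(joint))

print(f"I(X, Y) = {mi:.4f} bits, normalized = {normalized_mi:.4f}")
```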

# New Metric: Symmetric Normalized Mutual Information

As we can see in the Venn Diagram, the Mutual Information I(X, Y) is identical for the two variables X and Y. However, these variables can have different numbers of states.

Thus, the Symmetric Normalized Mutual Information takes into account the maximum values that the entropies of X and Y can take:

The Symmetric Normalized Mutual Information is available in the Relationship Analysis Report (Analysis | Report | Relationship)

It is also available in the Arc's Mutual Information (Analysis | Visual | Overall | Arc | Mutual Information)

# Updated Metric: Relative Mutual Information

This metric measures the percentage of information gained on X by observing Y:
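That is, the Mutual Information expressed as a fraction of the entropy of X:

$$\text{Relative MI}(X, Y) = \frac{I(X, Y)}{H(X)}$$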

This metric was named Normalized Mutual Information in previous versions of BayesiaLab.

The Relative Mutual Information is available in the Target Analysis Report (Analysis | Report | Target | Relationship with Target Node)

It is also available in the Arc's Mutual Information (Analysis | Visual | Overall | Arc | Mutual Information)

# Updated Metric: Symmetric Relative Mutual Information

This metric measures the percentage of information shared between X and Y:

This normalization is analogous to the Pearson Correlation Coefficient: the Mutual Information plays the role of the covariance, and the entropies play the role of the variances.
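A natural reading of this analogy (covariance divided by the geometric mean of the variances) gives the normalization:

$$\text{Symmetric Relative MI}(X, Y) = \frac{I(X, Y)}{\sqrt{H(X)\,H(Y)}}$$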

The Symmetric Relative Mutual Information is available in the Relationship Analysis Report (Analysis | Report | Relationship)

It is also available in the Arc's Mutual Information (Analysis | Visual | Overall | Arc | Mutual Information)

A closely related metric was named Symmetric Normalized Mutual Information in previous versions of BayesiaLab, with a slightly different normalization factor: