New Metric: Normalized Entropy
Entropy is a key metric in BayesiaLab for measuring the uncertainty associated with the probability distribution of a variable $X$. The entropy of a variable can also be understood as its Expected Log-Loss.
It is expressed in bits and defined as follows:
$$H(X) = -\sum_{x \in X} p(x)\,\log_2 p(x)$$
The maximum value of the entropy increases logarithmically with the number of states of the variable:
$$H_{max}(X) = \log_2(S_X)$$
where $S_X$ is the number of states of the variable $X$.
Thus, the entropies of variables that do not have the same number of states cannot be directly compared. This is why version 7.0 now features the Normalized Entropy, a metric that takes into account the maximum value of the entropy and returns a normalized measure of the uncertainty associated with the variable:
$$H_N(X) = \frac{H(X)}{\log_2(S_X)}$$
In Information Mode, the Normalized Entropy is available in the tooltip displayed while hovering over a monitor:
It is also possible to sort the monitors by Normalized Entropy via the Monitor Panel Contextual Menu.
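As an illustration of these definitions, here is a minimal Python sketch that computes the Entropy and the Normalized Entropy of a discrete distribution. This is not BayesiaLab code; the function names are hypothetical.

```python
import numpy as np

def entropy(p):
    """Entropy in bits of a discrete distribution p; zero-probability states contribute 0."""
    p = np.asarray(p, dtype=float)
    nz = p[p > 0]
    return -np.sum(nz * np.log2(nz))

def normalized_entropy(p):
    """Entropy divided by its maximum value, log2 of the number of states."""
    return entropy(p) / np.log2(len(p))

# A uniform 4-state variable and a uniform 2-state variable have different
# entropies (2 bits vs. 1 bit) but the same Normalized Entropy (1.0),
# so the normalized values are directly comparable.
print(entropy([0.25, 0.25, 0.25, 0.25]))             # 2.0
print(normalized_entropy([0.25, 0.25, 0.25, 0.25]))  # 1.0
print(normalized_entropy([0.5, 0.5]))                # 1.0
```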
New Metric: Normalized Mutual Information
The Conditional Entropy $H(Y|X)$ measures, in bits, the Expected Log-Loss associated with variable $Y$ once we have information on variable $X$:
$$H(Y|X) = -\sum_{x \in X}\sum_{y \in Y} p(x,y)\,\log_2 p(y|x)$$
This conditional entropy is thus key for defining the Mutual Information between $X$ and $Y$.
The Mutual Information $I(X,Y)$ measures the amount of information gained on variable $Y$ (i.e., the reduction in the Expected Log-Loss) by observing variable $X$:
$$I(X,Y) = H(Y) - H(Y|X)$$
The Venn Diagram below illustrates this concept:
Version 7.0 now features a normalized version of the Mutual Information that takes into account the maximum value that the entropy of $Y$ can take:
$$I_N(X,Y) = \frac{I(X,Y)}{\log_2(S_Y)}$$
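To make the chain of definitions concrete, the following Python sketch derives the Conditional Entropy, the Mutual Information, and the Normalized Mutual Information from a joint probability table. The function names and the joint table are illustrative, not BayesiaLab's.

```python
import numpy as np

def entropy(p):
    """Entropy in bits of the (possibly multidimensional) distribution p."""
    p = np.asarray(p, dtype=float).ravel()
    nz = p[p > 0]
    return -np.sum(nz * np.log2(nz))

def conditional_entropy(pxy):
    """H(Y|X) via the chain rule: H(Y|X) = H(X, Y) - H(X)."""
    return entropy(pxy) - entropy(pxy.sum(axis=1))

def mutual_information(pxy):
    """I(X, Y) = H(Y) - H(Y|X): the reduction in Expected Log-Loss on Y."""
    return entropy(pxy.sum(axis=0)) - conditional_entropy(pxy)

def normalized_mutual_information(pxy):
    """I(X, Y) divided by log2(S_Y), the maximum value of H(Y)."""
    return mutual_information(pxy) / np.log2(pxy.shape[1])

# Joint table pxy[x, y] for two correlated binary variables.
pxy = np.array([[0.4, 0.1],
                [0.1, 0.4]])
print(mutual_information(pxy))             # ~0.278 bits
print(normalized_mutual_information(pxy))  # ~0.278 (log2(S_Y) = 1 here)
```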
New Metric: Symmetric Normalized Mutual Information
As we can see in the Venn Diagram, the Mutual Information $I(X,Y)$ is identical for the two variables $X$ and $Y$. However, these variables can have different numbers of states.
Thus, the Symmetric Normalized Mutual Information takes into account the maximum values that the entropies of $X$ and $Y$ can take:
$$I_{SN}(X,Y) = \frac{2\,I(X,Y)}{\log_2(S_X) + \log_2(S_Y)}$$
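A minimal Python sketch of this symmetric normalization, assuming the normalization factor combines the two maximum entropies as in the formula above (the function name is hypothetical):

```python
import numpy as np

def entropy(p):
    p = np.asarray(p, dtype=float).ravel()
    return -np.sum(p[p > 0] * np.log2(p[p > 0]))

def symmetric_normalized_mi(pxy):
    """2 * I(X, Y) / (log2(S_X) + log2(S_Y)); symmetric in X and Y."""
    # I(X, Y) = H(X) + H(Y) - H(X, Y), equivalent to H(Y) - H(Y|X).
    mi = entropy(pxy.sum(axis=1)) + entropy(pxy.sum(axis=0)) - entropy(pxy)
    s_x, s_y = pxy.shape
    return 2 * mi / (np.log2(s_x) + np.log2(s_y))

# X has 3 states and Y has 2: the symmetric metric uses both state counts.
pxy = np.array([[0.30, 0.05],
                [0.10, 0.20],
                [0.05, 0.30]])
print(symmetric_normalized_mi(pxy))
```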
Updated Metric: Relative Mutual Information
This metric measures the percentage of information gained on $Y$ by observing $X$:
$$I_R(X,Y) = \frac{I(X,Y)}{H(Y)} \times 100$$
This metric was named Normalized Mutual Information in previous versions of BayesiaLab.
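The following sketch illustrates the relative version, which divides by the actual entropy $H(Y)$ rather than by its maximum value. Again, the function name is hypothetical:

```python
import numpy as np

def entropy(p):
    p = np.asarray(p, dtype=float).ravel()
    return -np.sum(p[p > 0] * np.log2(p[p > 0]))

def relative_mutual_information(pxy):
    """I(X, Y) / H(Y): the fraction of Y's uncertainty removed by observing X."""
    h_y = entropy(pxy.sum(axis=0))
    mi = entropy(pxy.sum(axis=1)) + h_y - entropy(pxy)
    return mi / h_y

pxy = np.array([[0.4, 0.1],
                [0.1, 0.4]])
print(f"{relative_mutual_information(pxy):.1%}")  # ~27.8% of H(Y) explained by X
```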
Updated Metric: Symmetric Relative Mutual Information
This normalization is calculated akin to the Pearson Correlation Coefficient: the Mutual Information is analogous to the covariance, and the Entropy is analogous to the variance:
$$I_{SR}(X,Y) = \frac{I(X,Y)}{\sqrt{H(X)\,H(Y)}}$$
A closely related metric was named Symmetric Normalized Mutual Information in previous versions of BayesiaLab, with a slightly different normalization factor:
$$\frac{2\,I(X,Y)}{H(X) + H(Y)}$$
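The sketch below contrasts the two normalizations, a geometric mean versus an arithmetic mean of the two entropies. The pre-7.0 formula shown here is an assumption based on the description above, and the function names are hypothetical:

```python
import numpy as np

def entropy(p):
    p = np.asarray(p, dtype=float).ravel()
    return -np.sum(p[p > 0] * np.log2(p[p > 0]))

def symmetric_relative_mi(pxy):
    """Pearson-style normalization: I(X, Y) / sqrt(H(X) * H(Y))."""
    h_x, h_y = entropy(pxy.sum(axis=1)), entropy(pxy.sum(axis=0))
    mi = h_x + h_y - entropy(pxy)
    return mi / np.sqrt(h_x * h_y)

def legacy_symmetric_normalized_mi(pxy):
    """Assumed pre-7.0 variant: 2 * I(X, Y) / (H(X) + H(Y))."""
    h_x, h_y = entropy(pxy.sum(axis=1)), entropy(pxy.sum(axis=0))
    mi = h_x + h_y - entropy(pxy)
    return 2 * mi / (h_x + h_y)

pxy = np.array([[0.30, 0.05],
                [0.10, 0.20],
                [0.05, 0.30]])
# The two values differ only through the geometric vs. arithmetic mean
# of H(X) and H(Y), so they coincide whenever H(X) = H(Y).
print(symmetric_relative_mi(pxy), legacy_symmetric_normalized_mi(pxy))
```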