Symmetric Relative Mutual Information

Definition

Symmetric Relative Mutual Information computes the proportion of information gained by observing X and Y:

$$I_{SR}(X,Y) = \frac{I(X,Y)}{\sqrt{H(X) \times H(Y)}}$$
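The definition above can be sketched directly from a joint probability table. This is a minimal illustration (not BayesiaLab's implementation); the function name and the 2×2 example table are my own, and entropies are taken in bits (log base 2).

```python
import math

def entropy(p):
    """Shannon entropy (in bits) of a probability distribution."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

def symmetric_relative_mi(joint):
    """I_SR(X,Y) = I(X,Y) / sqrt(H(X) * H(Y)),
    where joint[i][j] is P(X=i, Y=j)."""
    px = [sum(row) for row in joint]            # marginal P(X)
    py = [sum(col) for col in zip(*joint)]      # marginal P(Y)
    mi = sum(
        pxy * math.log2(pxy / (px[i] * py[j]))
        for i, row in enumerate(joint)
        for j, pxy in enumerate(row)
        if pxy > 0
    )
    return mi / math.sqrt(entropy(px) * entropy(py))

# Perfectly dependent binary variables: all information is shared.
print(symmetric_relative_mi([[0.5, 0.0], [0.0, 0.5]]))  # 1.0

# Independent variables: no information is shared.
print(symmetric_relative_mi([[0.25, 0.25], [0.25, 0.25]]))  # 0.0
```

Because the normalizer is the geometric mean of the two entropies, the result lies between 0 (independence) and 1 (a deterministic one-to-one relationship).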

This normalization is calculated similarly to Pearson's Correlation Coefficient $\rho$:

$$\rho_{X,Y} = \frac{\mathrm{cov}(X,Y)}{\sqrt{\sigma_X^2 \sigma_Y^2}}$$

where $\sigma^2$ denotes variance.

So, Mutual Information is comparable to covariance, and Entropy is analogous to variance.
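To make the analogy concrete, here is a minimal sketch of Pearson's $\rho$ computed with exactly the same normalization pattern: a shared quantity (covariance) divided by the geometric mean of two marginal quantities (variances). The function name and sample data are illustrative only; population (divide-by-$n$) formulas are assumed.

```python
import math

def pearson(xs, ys):
    """rho = cov(X,Y) / sqrt(var(X) * var(Y)), using population formulas."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / n
    var_x = sum((x - mx) ** 2 for x in xs) / n
    var_y = sum((y - my) ** 2 for y in ys) / n
    return cov / math.sqrt(var_x * var_y)

# A perfectly linear relationship gives rho = 1, just as a deterministic
# one-to-one relationship gives I_SR = 1.
print(pearson([1, 2, 3, 4], [2, 4, 6, 8]))  # 1.0
```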

Usage

For a given network, BayesiaLab can report the Symmetric Relative Mutual Information in several contexts:

  • Main Menu > Analysis > Report > Relationship Analysis:

  • The Symmetric Relative Mutual Information can also be shown by selecting Main Menu > Analysis > Visual > Overall > Arc > Mutual Information and then clicking the Show Arc Comments icon or selecting Main Menu > View > Show Arc Comments.

  • Note that the corresponding options under Preferences > Analysis > Visual Analysis > Arc's Mutual Information Analysis have to be selected first:

  • In Preferences, Child refers to the Relative Mutual Information from the Parent onto the Child node, i.e., in the direction of the arc.

  • Conversely, Parent refers to the Relative Mutual Information from the Child onto the Parent node, i.e., in the opposite direction of the arc.


Copyright © 2024 Bayesia S.A.S., Bayesia USA, LLC, and Bayesia Singapore Pte. Ltd. All Rights Reserved.