
To overcome this challenge, we turn to concepts from information theory. In particular, we use mutual information, a quantity that measures the mutual dependence of two variables, i.e., how much knowing one variable reduces the uncertainty about the other. Intuitively, on a random issue, a Democrat's vote is more informative about another Democrat's vote than about a Republican's vote.
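To make the idea concrete, here is a minimal sketch (not BayesiaLab code) of estimating the mutual information between two discrete variables from paired observations; the voting data below is hypothetical and chosen only to illustrate the intuition above.

```python
# Minimal sketch: I(X;Y) = sum over (x, y) of p(x,y) * log2(p(x,y) / (p(x) * p(y))).
import math
from collections import Counter

def mutual_information(xs, ys):
    """Mutual information (in bits) of two paired sequences of discrete values."""
    n = len(xs)
    pxy = Counter(zip(xs, ys))   # joint counts
    px = Counter(xs)             # marginal counts of X
    py = Counter(ys)             # marginal counts of Y
    mi = 0.0
    for (x, y), c in pxy.items():
        # (c/n) * log2( (c/n) / ((px/n) * (py/n)) ), simplified below
        mi += (c / n) * math.log2(c * n / (px[x] * py[y]))
    return mi

# Hypothetical votes on a random issue: Democrats mostly vote "yes",
# Republicans mostly "no", so party and vote share information.
party = ["D", "D", "D", "D", "R", "R", "R", "R"]
vote  = ["y", "y", "y", "n", "n", "n", "n", "y"]
print(round(mutual_information(party, vote), 3))  # → 0.189
```

A value of 0 would mean the variables are independent; identical variables would yield the full entropy of the distribution (here, 1 bit).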


As its name implies, the Mutual Information Map layout algorithm implemented in BayesiaLab utilizes mutual information. More specifically, it computes the mutual information matrix between all nodes and then uses a genetic algorithm to search for a node layout in which the distance between two nodes is inversely proportional to their mutual information. Put simply, the closer two nodes are positioned to each other, the greater the mutual information between them.
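The search described above can be sketched as follows. This is a toy illustration, not BayesiaLab's implementation: the stress objective, population size, mutation scheme, and the 3-node mutual information matrix are all assumptions chosen for brevity.

```python
# Toy genetic search for a 2D layout where the distance between two nodes
# approaches scale / MI(i, j), so high-MI pairs end up close together.
import math
import random

def stress(layout, mi, scale=1.0):
    """Sum of squared gaps between actual distances and the MI-derived targets."""
    total = 0.0
    n = len(layout)
    for i in range(n):
        for j in range(i + 1, n):
            target = scale / mi[i][j]              # distance inversely prop. to MI
            total += (math.dist(layout[i], layout[j]) - target) ** 2
    return total

def evolve(mi, pop_size=30, generations=200, seed=0):
    """Tiny genetic algorithm: truncation selection plus Gaussian mutation."""
    rng = random.Random(seed)
    n = len(mi)
    pop = [[(rng.uniform(0, 10), rng.uniform(0, 10)) for _ in range(n)]
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=lambda lay: stress(lay, mi))
        survivors = pop[:pop_size // 2]            # keep the fittest layouts
        children = [[(x + rng.gauss(0, 0.3), y + rng.gauss(0, 0.3))
                     for x, y in parent]           # mutate each survivor
                    for parent in survivors]
        pop = survivors + children
    return min(pop, key=lambda lay: stress(lay, mi))

# Hypothetical 3-node MI matrix: nodes 0 and 1 share much information,
# node 2 is nearly independent of both (diagonal unused by stress()).
mi = [[None, 0.8, 0.1],
      [0.8, None, 0.1],
      [0.1, 0.1, None]]
best = evolve(mi)
d01 = math.dist(best[0], best[1])
d02 = math.dist(best[0], best[2])
print(round(d01, 2), round(d02, 2))
```

In a converged layout, the high-MI pair (0, 1) sits much closer together than either node sits to the near-independent node 2, which is exactly the visual effect the Mutual Information Map aims for.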