Answer

A Bayesian Belief Network (BBN) is a compact representation of the Joint Probability Distribution (JPD) defined over its variables. Given this JPD, inference amounts to computing the conditional probability of one subset of variables given evidence on another subset. For small models (fewer than 10 binary variables) this is trivial. In general, *what* is computed is straightforward to state but impractical to obtain by brute force; *how* it is computed efficiently is the subject of a large published literature on propagation and approximation methods.
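
As a minimal sketch of the "what is computed" part, the snippet below answers a conditional query by enumerating the JPD of a tiny two-node network A → B. The network and its probabilities are made-up examples, not anything specific to BayesiaLab; the point is that the same enumeration over all 2^n configurations is exactly what becomes impractical as the network grows.

```python
# Brute-force inference by enumeration on a made-up two-node network A -> B.
p_a = {True: 0.3, False: 0.7}                      # P(A)
p_b_given_a = {True: {True: 0.9, False: 0.1},      # P(B | A=true)
               False: {True: 0.2, False: 0.8}}     # P(B | A=false)

def joint(a, b):
    """One entry of the JPD: P(A=a, B=b) = P(A=a) * P(B=b | A=a)."""
    return p_a[a] * p_b_given_a[a][b]

# P(A=true | B=true) = P(A=true, B=true) / sum_a P(A=a, B=true)
numerator = joint(True, True)
denominator = sum(joint(a, True) for a in (True, False))
print("P(A=true | B=true) =", numerator / denominator)   # ~0.659
```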

BayesiaLab's probabilistic inference is based on the Junction Tree algorithm for exact inference, and on Importance Sampling for approximate inference when exact inference is intractable due to the complexity of the BBN.
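
To illustrate the approximate side, here is a minimal sketch of likelihood weighting, one common form of importance sampling for Bayesian networks; it reuses the hypothetical A → B network above and is not a description of BayesiaLab's actual implementation.

```python
# Likelihood weighting (a form of importance sampling) on the same A -> B example.
import random

p_a = {True: 0.3, False: 0.7}
p_b_given_a = {True: 0.9, False: 0.2}   # P(B=true | A)

def likelihood_weighting(n_samples=100_000, evidence_b=True, seed=0):
    """Estimate P(A=true | B=evidence_b): sample A from its prior and
    weight each sample by the likelihood of the observed evidence."""
    rng = random.Random(seed)
    weighted_true = total_weight = 0.0
    for _ in range(n_samples):
        a = rng.random() < p_a[True]                   # sample A ~ P(A)
        p_b = p_b_given_a[a]
        weight = p_b if evidence_b else 1.0 - p_b      # weight = P(evidence | A=a)
        total_weight += weight
        if a:
            weighted_true += weight
    return weighted_true / total_weight

print("P(A=true | B=true) ≈", likelihood_weighting())  # converges to the exact ~0.659
```

The estimate approaches the exact answer as the number of samples grows, which is why sampling-based methods remain usable when the network is too complex for exact propagation.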