Exploiting IB Assignments for Approximating Marginal Probabilities
AIR FORCE INST OF TECH WRIGHT-PATTERSON AFB OH SCHOOL OF ENGINEERING
Computing marginal probabilities, whether prior or posterior, in Bayesian belief networks is a hard problem. This paper discusses deterministic approximation schemes that work by adding up the probability mass in a small number of value assignments to the network variables. Under certain assumptions, the probability mass in the union of these assignments is sufficient to obtain a good approximation. Such methods are especially useful for highly connected networks, where a large maximum clique size or cutset size makes the standard algorithms intractable. In considering assignments, it is not necessary to assign values to variables that are independent of (d-separated from) the evidence and query nodes. In many cases, however, there is a finer independence structure that is not evident from the topology but depends on the conditional distributions of the nodes. We note that independence-based (IB) assignments, which were originally proposed as a theory of abductive explanations, take advantage of such independence and thus contain fewer assigned variables. As a result, the probability mass in each assignment is greater than in the corresponding complete assignment. Thus, fewer IB assignments are sufficient, and a good approximation can be obtained more efficiently. IB assignments can be used to efficiently approximate posterior node probabilities even in cases that do not obey the rather strict skewness assumptions used in previous research. Two algorithms for finding the high-probability IB assignments are suggested: one based on best-first heuristic search, and another based on special-purpose integer linear programming. Experimental results show that this approach is feasible for highly connected belief networks.
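The core idea above — approximating a marginal by summing the probability mass of a few high-probability assignments found by best-first search — can be sketched as follows. This is a simplified illustration, not the paper's IB-specific algorithm: it searches over complete assignments of a hypothetical three-variable chain network (all variable names and CPT numbers are made up), using the product of per-variable maxima as an admissible upper bound so that complete assignments are popped from the priority queue in descending order of mass.

```python
import heapq

# Hypothetical chain network A -> B -> C; each CPT is keyed by a
# tuple of parent values. Numbers are illustrative only.
network = [
    ("A", (), {(): {0: 0.9, 1: 0.1}}),
    ("B", ("A",), {(0,): {0: 0.8, 1: 0.2}, (1,): {0: 0.3, 1: 0.7}}),
    ("C", ("B",), {(0,): {0: 0.95, 1: 0.05}, (1,): {0: 0.4, 1: 0.6}}),
]
evidence = {"C": 1}  # we approximate the marginal P(C=1)

def top_assignments(k):
    """Best-first search for the k highest-mass assignments consistent
    with the evidence, expanded in topological variable order."""
    def bound(depth):
        # Admissible upper bound on the mass contribution of the
        # variables not yet assigned (product of per-variable maxima).
        b = 1.0
        for var, _parents, cpt in network[depth:]:
            if var in evidence:
                b *= max(row[evidence[var]] for row in cpt.values())
            else:
                b *= max(p for row in cpt.values() for p in row.values())
        return b

    # Heap entries: (negated priority, tie-breaker, exact prefix mass,
    # depth, partial assignment). Priority = prefix mass * bound.
    heap = [(-bound(0), 0, 1.0, 0, {})]
    tie, found = 1, []
    while heap and len(found) < k:
        _, _, prefix, depth, partial = heapq.heappop(heap)
        if depth == len(network):
            found.append((prefix, partial))  # popped in descending mass
            continue
        var, parents, cpt = network[depth]
        row = cpt[tuple(partial[p] for p in parents)]
        values = [evidence[var]] if var in evidence else list(row)
        for v in values:
            mass = prefix * row[v]
            if mass == 0.0:
                continue
            child = {**partial, var: v}
            heapq.heappush(heap, (-mass * bound(depth + 1), tie,
                                  mass, depth + 1, child))
            tie += 1
    return found

top = top_assignments(2)
print(sum(m for m, _ in top))  # mass of the two heaviest assignments:
                               # a lower bound on P(C=1)
```

Because the bound never underestimates the remaining mass, the first k complete assignments popped are exactly the k heaviest, and their total mass lower-bounds the true marginal; the paper's contribution is that searching over IB assignments instead of complete ones lets each popped assignment carry more mass, so fewer are needed.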
- Operations Research