
Belief Propagation Implementation

I am trying to implement Bayesian Networks.

My main graph is a factor graph that I want to use for belief propagation. However, when calculating messages in belief propagation, not all of the arguments are passed to the function, and the resulting function is a restriction of the joint distribution.

The best approach I can think of is to somehow restrict the functions so that I do not have to redo all of the substitutions every time I want to calculate a marginal for a new value.

I asked how to implement such a function here.

I want to know whether there is a better way to do this, or whether there are simpler and faster approaches than the one I have in mind.

Cupitor asked Jun 18 '13 08:06
1 Answer

Here's a suggestion: create a closure that, for the first computation, accepts a map whose key-value pairs are the initial variables and their respective values. The closure returns an inner function that accepts another map with the remaining variables and values for the final computation.

So define a closure in which the first partial computation is done in the outer function. Based on your link, the partial computation is a sum, but I imagine you will actually be computing products of probabilities. The inner function has access to the partial sum as a free variable; the computation is completed when you invoke it with a map containing the remaining variable-value pairs.

You can also define, in the outer function, a set that holds all variables used in the first computation, and let the inner function access this set as a free variable as well. This ensures that any variable keys already encountered in the first computation are excluded from the final computation.

All of this is illustrated below.

def f1(map1):

    # set to contain seen variables as keys
    seen_keys = set()

    computed_val1 = 0.0

    for key, val in map1.items():
        computed_val1 += val

        # remember keys already used in the first computation
        seen_keys.add(key)

    def f2(map2):
        computed_val2 = computed_val1

        for key2, val2 in map2.items():
            # skip keys already used in the first computation
            if key2 in seen_keys:
                continue

            computed_val2 += val2

        return computed_val2

    return f2

if __name__ == '__main__':

    partial_map = {'factor1': 1, 'factor2': 2}
    func = f1(partial_map)

    remaining_map1 = {'factor3': 3}
    answer1A = func(remaining_map1)
    print("Answer after using partial and remaining maps =", answer1A)

    # complete the partial map with factor3 to show that
    # the returned function ignores variables already seen in the 1st computation
    partial_map['factor3'] = 3
    answer1B = func(partial_map)
    print("Answer with completed map to same func =", answer1B)

    # Compute remaining map with different value for factor 3
    remaining_map2 = {'factor3': 15}
    answer2 = func(remaining_map2)
    print("Answer with different second map =", answer2)
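If tracking seen keys by hand feels heavy, a similar restriction can be achieved with `functools.partial` from the standard library. This is only a sketch under the same assumption as above (the message is a sum over variable values; swap in a product if that is what your factors need), and the function name `restricted_sum` is made up for illustration. Keys fixed in the first map simply take precedence over any duplicates supplied later:

```python
from functools import partial

def restricted_sum(fixed_map, remaining_map):
    # values from fixed_map win on key collisions, which plays
    # the same role as the seen_keys check in the closure version
    merged = {**remaining_map, **fixed_map}
    return sum(merged.values())

if __name__ == '__main__':
    # fix factor1 and factor2 once, vary factor3 afterwards
    func = partial(restricted_sum, {'factor1': 1, 'factor2': 2})

    print(func({'factor3': 3}))    # 6
    print(func({'factor3': 15}))   # 18
```

Compared with the closure, this version keeps no mutable state between calls; the dict merge does the restriction in one step.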
rainman answered Sep 28 '22 17:09