Lab 5: Bayes Nets

From 6.034 Wiki


Revision as of 07:54, 25 November 2015

This is an entirely optional lab for you to practice manipulating Bayes nets and probabilities computationally. This lab is worth no credit whatsoever. There will be no online tests. (However, if you would like to provide feedback to improve this new lab for next year, see below.) TODO

To get the code...

Your answers will go in the main file lab_bayes.py.

Problems

Descendants and non-descendants

Implement a function that returns a set containing the descendants of the variable in the network. This set should include the variable's children, its children's children, etc.

def get_descendants(net, var):

Implement a function that returns a set containing the non-descendants of the variable. Note that a variable is neither a descendant nor a non-descendant of itself.

def get_nondescendants(net, var):
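Here is a minimal sketch of both functions. It assumes, purely for illustration, that the net can be queried as a plain dict mapping each variable to a list of its children; the real BayesNet API (described under API below) will differ, but the traversal logic carries over.

```python
def get_descendants(net, var):
    """Return the set of all descendants of var: children, grandchildren, etc."""
    # Assumption: net is a dict mapping each variable to a list of its children.
    descendants = set()
    frontier = list(net[var])
    while frontier:
        child = frontier.pop()
        if child not in descendants:
            descendants.add(child)
            frontier.extend(net[child])
    return descendants

def get_nondescendants(net, var):
    """Return all variables that are neither var itself nor a descendant of var."""
    return set(net) - get_descendants(net, var) - {var}
```

Note that the frontier-based walk handles diamond-shaped graphs correctly: a variable reachable along two paths is added only once.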


Probability

Hypotheses and givens are expressed as dicts assigning variables to values. For example, P(A=False | B=True, C=False) is represented as the two dicts:

hypothesis = {"A": False}
givens = {"B": True, "C": False}

First, write a helper function:

def remove_nondescendants_given_parents(net, var, givens):

If all parents of var are given, this function returns a new dict of givens with var's non-descendants (except parents) removed. Otherwise, the function returns False. In either case, it should not modify the original dict of givens.

Hint: The set method .issubset may be useful here.
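A sketch of the helper, again under the (hypothetical) assumption that the net is a dict mapping each variable to its list of children:

```python
def remove_nondescendants_given_parents(net, var, givens):
    # Assumption: net is a dict mapping each variable to a list of its children.
    parents = {p for p, children in net.items() if var in children}
    if not parents.issubset(givens):
        return False                 # some parent of var is not given
    # collect var's descendants by walking the children lists
    descendants = set()
    frontier = list(net[var])
    while frontier:
        child = frontier.pop()
        if child not in descendants:
            descendants.add(child)
            frontier.extend(net[child])
    # keep parents and descendants; drop every other (non-descendant) given.
    # Building a fresh dict leaves the caller's givens untouched.
    keep = parents | descendants
    return {g: val for g, val in givens.items() if g in keep}
```

The dict comprehension at the end is what guarantees the original givens are not modified.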


Now, we will implement a function that looks up probabilities in the net's conditional probability tables:

def probability_lookup(net, hypothesis, givens=None):

If the probability can be looked up directly (or reduced to a form that can be looked up directly) in the network's probability tables, return it; otherwise return None.

Remember that you can simplify some probability expressions using the independence assumptions of the Bayes net.
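As a self-contained sketch, suppose (hypothetically) the net is a dict mapping each variable to a pair `(parent list, CPT)`, where the CPT maps a tuple of parent values to P(var=True | parents), with boolean variables throughout. A lookup succeeds only for a single-variable hypothesis whose parents are all given, and any extra givens can be dropped only if they are non-descendants:

```python
def probability_lookup(net, hypothesis, givens=None):
    # Assumption: net is a dict var -> (parent list, CPT), where the CPT maps
    # tuples of parent values to P(var=True | parents); boolean variables only.
    if len(hypothesis) != 1:
        return None                  # only single-variable lookups are direct
    (var, val), = hypothesis.items()
    parents, cpt = net[var]
    givens = dict(givens or {})
    if not set(parents).issubset(givens):
        return None                  # parents not fully specified
    # extra givens may be dropped only if they are non-descendants of var
    descendants = set()
    frontier = [v for v, (ps, _) in net.items() if var in ps]
    while frontier:
        c = frontier.pop()
        if c not in descendants:
            descendants.add(c)
            frontier.extend(v for v, (ps, _) in net.items() if c in ps)
    extras = set(givens) - set(parents)
    if extras & descendants:
        return None                  # an observed descendant blocks the reduction
    p_true = cpt[tuple(givens[q] for q in parents)]
    return p_true if val else 1.0 - p_true
```

The "drop the extra non-descendants" step is exactly the independence assumption of a Bayes net: given its parents, a variable is independent of its non-descendants.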


Given a dict assigning every variable in the network to a value, compute its joint probability (e.g. {"A": True, "B": False, "C": False}):

probability_joint(net, hypothesis)

Use the chain rule to compute joint probabilities in terms of values produced by probability_lookup. You may assume that the hypothesis represents a valid joint probability (that is, contains every variable in the Bayes net).
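The chain rule can be sketched directly. This version assumes the same toy representation as above (a dict mapping each variable to its parent list and a CPT over parent-value tuples, boolean variables only), rather than the real BayesNet API:

```python
def probability_joint(net, hypothesis):
    # Assumption: net is a dict var -> (parent list, CPT), where the CPT maps
    # tuples of parent values to P(var=True | parents); boolean variables only.
    # Chain rule: P(X1, ..., Xn) = product over i of P(Xi | parents(Xi)).
    p = 1.0
    for var, (parents, cpt) in net.items():
        p_true = cpt[tuple(hypothesis[q] for q in parents)]
        p *= p_true if hypothesis[var] else 1.0 - p_true
    return p
```

Because the hypothesis assigns every variable, each factor is a direct table lookup; no summation is needed.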


A marginal probability can be represented as a sum of joint probabilities:

probability_marginal(net, hypothesis)

Compute marginal probabilities as sums of joint probabilities produced by probability_joint.

Hint 1: The BayesNet method net.combinations may be useful. (See the API below for details.)

Hint 2: To combine two dictionaries d1 and d2 to form a third dict d3, you can use d3 = dict(d1, **d2).
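A sketch under the same toy representation as the joint sketch (repeated here so the block runs standalone). Here `itertools.product` plays the role that `net.combinations` plays in the real API, enumerating every assignment of the unmentioned variables:

```python
from itertools import product

def probability_joint(net, hypothesis):
    # chain rule over an assumed representation: dict var -> (parent list,
    # CPT from parent-value tuples to P(var=True | parents)); booleans only
    p = 1.0
    for var, (parents, cpt) in net.items():
        p_true = cpt[tuple(hypothesis[q] for q in parents)]
        p *= p_true if hypothesis[var] else 1.0 - p_true
    return p

def probability_marginal(net, hypothesis):
    # sum the joint over every assignment of the variables the hypothesis omits
    free = [v for v in net if v not in hypothesis]
    total = 0.0
    for values in product([True, False], repeat=len(free)):
        full = dict(hypothesis, **dict(zip(free, values)))  # Hint 2's dict merge
        total += probability_joint(net, full)
    return total
```

A useful sanity check: the marginal of the empty hypothesis sums over every joint assignment, so it should come out to 1.0.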


Some conditional probabilities can be looked up in the Bayes net using probability_lookup. The rest can be computed as ratios of marginal probabilities.

probability_conditional(net, hypothesis, givens=None)


Use all of the above types of probability to produce a function that can compute any probability expression in terms of the Bayes net parameters:

def probability(net, hypothesis, givens=None) :
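The conditional and the top-level dispatcher can be sketched together, again over the assumed toy representation (the chain-rule and marginal helpers are repeated so the block runs standalone):

```python
from itertools import product

def probability_joint(net, hypothesis):
    # assumed representation: dict var -> (parent list, CPT from parent-value
    # tuples to P(var=True | parents)); boolean variables only
    p = 1.0
    for var, (parents, cpt) in net.items():
        p_true = cpt[tuple(hypothesis[q] for q in parents)]
        p *= p_true if hypothesis[var] else 1.0 - p_true
    return p

def probability_marginal(net, hypothesis):
    free = [v for v in net if v not in hypothesis]
    return sum(probability_joint(net, dict(hypothesis, **dict(zip(free, vals))))
               for vals in product([True, False], repeat=len(free)))

def probability_conditional(net, hypothesis, givens=None):
    if not givens:
        return probability_marginal(net, hypothesis)
    # a hypothesis that contradicts a given has probability zero
    for var in set(hypothesis) & set(givens):
        if hypothesis[var] != givens[var]:
            return 0.0
    # ratio of marginals: P(H | G) = P(H and G) / P(G)
    joined = dict(hypothesis, **givens)
    return probability_marginal(net, joined) / probability_marginal(net, givens)

def probability(net, hypothesis, givens=None):
    # every case reduces to a conditional (givens=None falls back to a marginal)
    return probability_conditional(net, hypothesis, givens)
```

The contradiction check matters: merging the two dicts silently overwrites conflicting values, which would otherwise return a nonzero answer for an impossible query.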


Parameter-counting and independence

Here are two more things we can do with Bayes nets.


First, implement a function that returns the number of parameters in the network. Note that we are no longer assuming boolean variables, so some variables can take on more than two values.

def number_of_parameters(net) :
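One common counting convention: each CPT has one row per joint assignment of the parents, and each row needs |domain| - 1 free numbers, since the last entry is fixed by summing to 1. The sketch below assumes (hypothetically) that the topology and domain sizes are handed in as plain dicts; the real BayesNet object presumably exposes both, and check whether the tests expect the |domain| - 1 convention or full rows:

```python
from math import prod

def number_of_parameters(net, domains):
    # Assumptions: net is a dict var -> list of parents; domains is a dict
    # var -> number of values that variable can take.
    total = 0
    for var, parents in net.items():
        rows = prod(domains[p] for p in parents)   # product over no parents = 1
        total += rows * (domains[var] - 1)         # free parameters per row
    return total
```

For example, a boolean A with a three-valued child B needs 1 parameter for A and 2 per row of B's two-row table, for 5 in total.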


Second, implement a function that checks independence:

is_independent(network, var1, var2, givens=None)

If givens are supplied, return True if var1 and var2 are conditionally independent given the givens. If givens are not supplied, return True if var1 and var2 are marginally independent. Otherwise, return False.



Recall that variables can be independent either because of the topology of the network (structural independence), or because of their conditional probability table entries (numerical independence). It is sufficient to check only numerical independence, because variables that are structurally independent are guaranteed to also be numerically independent. (That is, you can implement this function by computing and comparing probabilities -- no need to code d-separation.)


Hint: The helper function approx_equal may be useful for comparing probabilities, which are floats.
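The numerical check can be sketched over the same toy representation used earlier (chain-rule and marginal helpers repeated so the block runs standalone, and a stand-in for `approx_equal`). Clearing the ratio test P(v1, v2 | g) = P(v1 | g) * P(v2 | g) of denominators avoids dividing by small marginals:

```python
from itertools import product

def approx_equal(a, b, epsilon=1e-9):
    # stand-in for the provided helper
    return abs(a - b) <= epsilon

def _joint(net, assignment):
    # assumed representation: dict var -> (parent list, CPT from parent-value
    # tuples to P(var=True | parents)); boolean variables only
    p = 1.0
    for var, (parents, cpt) in net.items():
        p_true = cpt[tuple(assignment[q] for q in parents)]
        p *= p_true if assignment[var] else 1.0 - p_true
    return p

def _marginal(net, hypothesis):
    free = [v for v in net if v not in hypothesis]
    return sum(_joint(net, dict(hypothesis, **dict(zip(free, vals))))
               for vals in product([True, False], repeat=len(free)))

def is_independent(net, var1, var2, givens=None):
    givens = dict(givens or {})
    p_g = _marginal(net, givens)          # P(givens); 1.0 when there are none
    for v1, v2 in product([True, False], repeat=2):
        both  = dict(givens, **{var1: v1, var2: v2})
        only1 = dict(givens, **{var1: v1})
        only2 = dict(givens, **{var2: v2})
        # P(v1, v2 | g) == P(v1 | g) * P(v2 | g), multiplied through by P(g)^2
        if not approx_equal(_marginal(net, both) * p_g,
                            _marginal(net, only1) * _marginal(net, only2)):
            return False
    return True
```

A classic check for this function is explaining away: in a common-effect net A -> C <- B, A and B are marginally independent but become dependent once C is observed.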

API

The file bayes_api.py defines the BayesNet class, as well as some helper functions, all described below.

BayesNet

TODO

Helper functions

TODO

Feedback

Do you want to help us improve this new lab for next year? If so, here are two ways you can help:

  • Email your feedback (and optionally your lab_bayes.py file, so we can use it to improve the tests) to Jessica at jmn@mit.edu with the subject line 'lab_bayes feedback'. [TODO mailto:...]
  • Provide anonymous feedback through this form [TODO]

If you have additional feedback on previous labs, we're happy to receive that as well!
