
## Introduction

In this assignment you will implement variable elimination for Bayes Nets.

What is supplied: Python code that implements Variable, Factor, and BN objects. The file bnetbase.py contains the class definitions for these objects. The code supports representing factors as tables of values indexed by various settings of the variables in the factor’s scope.

The template file bnetbase.py also contains function prototypes for the functions you must implement.

## Question 1: Implement Variable Elimination (worth 60/100 marks)

Implement the following functions that operate on Factor objects, and then use these functions to implement VE (variable elimination):

`multiply_factors` (worth 10 points). This function takes as input a list of Factor objects; it creates and returns a new factor that is equal to the product of the factors in the list. Do not modify any of the input factors.
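One way to sketch the product operation, using a simplified dict-based factor representation (a `(scope, table)` pair, where `scope` is a tuple of variable names and `table` maps value tuples to probabilities). This is an illustrative stand-in, not the actual `Factor` class in bnetbase.py, whose interface differs:

```python
from itertools import product

def multiply_factors(factors):
    """Product of a list of dict-based factors.

    Each factor is (scope, table): scope is a tuple of variable names,
    table maps value tuples (aligned with scope) to probabilities.
    Sketch only -- the real bnetbase.py Factor API is table-based
    but has its own accessors.
    """
    # Union of scopes, preserving first-seen order.
    scope = []
    for s, _ in factors:
        for v in s:
            if v not in scope:
                scope.append(v)
    # Recover each variable's domain from the tables themselves.
    domains = {}
    for s, table in factors:
        for assignment in table:
            for var, val in zip(s, assignment):
                domains.setdefault(var, set()).add(val)
    # Entry for each joint assignment = product of the factors'
    # entries for the matching sub-assignments.
    new_table = {}
    for values in product(*(sorted(domains[v]) for v in scope)):
        env = dict(zip(scope, values))
        p = 1.0
        for s, table in factors:
            p *= table[tuple(env[v] for v in s)]
        new_table[values] = p
    return tuple(scope), new_table
```

Note that the inputs are never mutated: a fresh scope and table are built from scratch, as the assignment requires.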

`restrict_factor` (worth 10 points). This function takes as input a single factor, a variable V, and a value d from the domain of that variable. It creates and returns a new factor that is the restriction of the input factor to the assignment V = d. Do not modify the input factor.
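On the same simplified dict-based representation (a `(scope, table)` pair, again not the real bnetbase.py signature), restriction keeps only the rows consistent with V = d and drops V from the scope:

```python
def restrict_factor(factor, var, value):
    """Restrict a dict-based factor to var = value.

    The restricted variable is removed from the scope, and only
    table rows where it took the given value survive.
    Illustrative sketch, not the bnetbase.py interface.
    """
    scope, table = factor
    i = scope.index(var)
    new_scope = scope[:i] + scope[i + 1:]
    new_table = {
        vals[:i] + vals[i + 1:]: p
        for vals, p in table.items()
        if vals[i] == value
    }
    return new_scope, new_table
```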

`sum_out_variable` (worth 10 points). This function takes as input a single factor and a variable V; it creates and returns a new factor that is the result of summing V out of the input factor. Do not modify the input factor.
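Summing out, sketched on the same hypothetical dict-based factors: rows that agree on everything except V collapse into one row whose value is their sum.

```python
def sum_out_variable(factor, var):
    """Sum var out of a dict-based (scope, table) factor.

    Rows differing only in var's value are merged by adding their
    probabilities. Illustrative sketch, not the bnetbase.py API.
    """
    scope, table = factor
    i = scope.index(var)
    new_scope = scope[:i] + scope[i + 1:]
    new_table = {}
    for vals, p in table.items():
        key = vals[:i] + vals[i + 1:]
        new_table[key] = new_table.get(key, 0.0) + p
    return new_scope, new_table
```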

`VE` (worth 30 points). This function takes as input a Bayes Net object (an object of class BN), a query variable Q, and a list of evidence variables E (all of which have had some value set as evidence using the variable's `set_evidence` interface). Compute the probability of every possible assignment to Q given the evidence specified by the evidence settings of the evidence variables. Return these probabilities as a list, where each number corresponds to the probability of one of Q's possible values. Do not modify any factor of the input Bayes net.
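Putting the three operations together, the full VE pipeline can be sketched end-to-end. The `restrict`, `multiply`, `sum_out`, and `ve` helpers below operate on the same hypothetical dict-based `(scope, table)` factors as above; they illustrate the algorithm's shape (restrict by evidence, eliminate hidden variables, normalize), not the bnetbase.py API:

```python
from itertools import product

# Minimal dict-based factors: (scope tuple, {value-tuple: prob}).
# An illustrative sketch of the VE pipeline, not the bnetbase.py API.

def restrict(factor, var, val):
    scope, table = factor
    i = scope.index(var)
    return (scope[:i] + scope[i + 1:],
            {k[:i] + k[i + 1:]: p for k, p in table.items() if k[i] == val})

def multiply(f1, f2):
    (s1, t1), (s2, t2) = f1, f2
    scope = s1 + tuple(v for v in s2 if v not in s1)
    doms = {}
    for s, t in (f1, f2):
        for k in t:
            for v, x in zip(s, k):
                doms.setdefault(v, set()).add(x)
    table = {}
    for vals in product(*(sorted(doms[v]) for v in scope)):
        env = dict(zip(scope, vals))
        table[vals] = (t1[tuple(env[v] for v in s1)]
                       * t2[tuple(env[v] for v in s2)])
    return scope, table

def sum_out(factor, var):
    scope, table = factor
    i = scope.index(var)
    out = {}
    for k, p in table.items():
        key = k[:i] + k[i + 1:]
        out[key] = out.get(key, 0.0) + p
    return scope[:i] + scope[i + 1:], out

def ve(factors, query, evidence):
    """factors: list of dict-based factors; query: a variable name;
    evidence: {var: value}. Returns {query value: P(value | evidence)}."""
    # 1. Restrict every factor by the evidence assignments.
    fs = []
    for f in factors:
        for var, val in evidence.items():
            if var in f[0]:
                f = restrict(f, var, val)
        fs.append(f)
    # 2. Eliminate every hidden variable (order chosen naively here;
    #    a min-fill heuristic would do better).
    hidden = {v for s, _ in fs for v in s} - {query}
    for var in hidden:
        related = [f for f in fs if var in f[0]]
        fs = [f for f in fs if var not in f[0]]
        prod = related[0]
        for f in related[1:]:
            prod = multiply(prod, f)
        fs.append(sum_out(prod, var))
    # 3. Multiply the remaining factors and normalize over the query.
    result = fs[0]
    for f in fs[1:]:
        result = multiply(result, f)
    total = sum(result[1].values())
    return {k[0]: p / total for k, p in result[1].items()}
```

For example, on a two-node net A → B, `ve([P_A, P_BgivenA], "B", {"A": "a"})` returns the posterior over B's values; with no evidence it returns the prior marginal. The final normalization is what turns the remaining unnormalized product into a conditional distribution.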

Assignment 4, University of Toronto, CSC384 – Intro to AI, Winter 2020

## Question 2: Problem Solving with your VE Implementation (worth 40/100 marks)

For the following questions, you will submit your answers using the Google form that is located at https:

1. Use your implementation to answer questions regarding the following Bayesian network. All variables are binary, with values Dom(A) = {a, ã}, Dom(B) = {b, b̃}, etc.

The probability table values are as follows (probabilities add to one, so you can compute the remaining probability values):

P(a) = 0.9

P(b|a,h) = 1.0; P(b|a,h̃) = 0.0; P(b|ã,h) = 0.5; P(b|ã,h̃) = 0.6

P(c|b,g) = 0.9; P(c|b,g̃) = 0.9; P(c|b̃,g) = 0.1; P(c|b̃,g̃) = 1.0

P(d|c,f) = 0.0; P(d|c,f̃) = 1.0; P(d|c̃,f) = 0.7; P(d|c̃,f̃) = 0.2

P(e|c) = 0.2; P(e|c̃) = 0.4

P(f) = 0.1

P(g) = 1.0

P(h) = 0.5

P(i|b) = 0.3; P(i|b̃) = 0.9

Using your Variable Elimination implementation (or by hand!), compute the following probabilities and post these to the Google Form (worth 10 points):

1. P(b|a)

2. P(c|a)

3. P(c|a,ẽ)

4. P(c|a,f̃)

1. Next, examine the file carDiagnosis.py. This specifies a Bayes Net for diagnosing various reasons why a car might not start. The layout of this Bayes Net is shown below, and the various CPTs for the Net are specified in carDiagnosis.py as Factors:

Each variable of the Net is shown in a square box along with the values that the variable can take. For example, the variable BatteryVoltage (the variable bv in the file carDiagnosis.py) can take on one of three different values: “strong”, “weak”, and “dead”.

The numbers and bars show the unconditional probabilities of the variables taking on their different values. For example, in the Net we have P(BatteryVoltage = dead) = 0.41. For the various CPTs for the Net, see carDiagnosis.py.

1. (worth 5 points) Show a case of conditional independence in the Net where knowing some evidence item V1 = d1 makes another evidence item V2 = d2 irrelevant to the probability of some third variable V3. (Note that conditional independence requires that the independence hold for all values of V3.)

1. (worth 5 points) Show a case of conditional independence in the Net where two variables are independent given NO evidence at a third variable, yet dependent given evidence at that third variable.

1. (worth 10 points) Show a sequence of accumulated evidence items V1 = d1, ..., Vk = dk (i.e., each evidence item in the sequence is added to the previous evidence items) such that each additional evidence item increases the probability that some variable V has the value d. (That is, the probability of V = d increases monotonically as we add evidence items.) What is P(V = d | V1 = d1, ..., Vk = dk)?

1. (worth 10 points) Show a sequence of accumulated evidence items V1 = d1, ..., Vk = dk (i.e., each evidence item in the sequence is added to the previous evidence items) such that each additional evidence item decreases the probability that some variable V has the value d. (That is, the probability of V = d decreases monotonically as we add evidence items.) What is P(V = d | V1 = d1, ..., Vk = dk)?

HAVE FUN and GOOD LUCK!