UNIVERSITY OF ILLINOIS AT URBANA-CHAMPAIGN
Department of Electrical and Computer Engineering
CS 440/ECE 448 Artificial Intelligence
Spring 2020

EXAM 2 SOLUTIONS

Monday, March 30, 2020

Problem 1, Version 1 (5 points)

Consider three binary events, A, B, and C, with probabilities given by P(A) = 0.7, P(B) = 0.4, and P(C) = 0.3.

(a) What's the largest possible P(B ∧ C)?

Solution: max P(B ∧ C) = min(P(B), P(C)) = 0.3

(b) If A and B are independent, what's P(A ∧ B)?

Solution: P(A)P(B) = (0.7)(0.4) = 0.28

Problem 1, Version 2 (5 points)

Consider three binary events, A, B, and C, with probabilities given by P(A) = 0.7, P(B) = 0.4, and P(C) = 0.3.

(a) What's the smallest possible P(A ∧ B)?

Solution:
min P(A ∧ B) = 1 − max P(¬(A ∧ B)) = 1 − max P(¬A ∨ ¬B) = 1 − (P(¬A) + P(¬B)) = 1 − 0.3 − 0.6 = 0.1
Here max P(¬A ∨ ¬B) = P(¬A) + P(¬B) = 0.9 is achievable because that sum does not exceed 1 (it occurs when ¬A and ¬B are mutually exclusive).

(b) If A and B are independent, what's P(A ∨ B)?

Solution: P(A ∨ B) = P(A) + P(B) − P(A ∧ B) = 0.7 + 0.4 − (0.7)(0.4) = 1.1 − 0.28 = 0.82

Problem 1, Version 3 (5 points)

Consider three binary events, A, B, and C, with probabilities given by P(A) = 0.7, P(B) = 0.4, and P(C) = 0.3.

(a) What's the largest possible P(A ∨ B)?

Solution: P(A ∨ B) = P(A) + P(B) − P(A ∧ B) ≤ P(A) + P(B) = 1.1. Since 1.1 > 1, we conclude that max P(A ∨ B) = 1.

(b) If A and B are independent, what's P(A ∧ ¬B)?

Solution: P(A ∧ ¬B) = P(A)P(¬B) = (0.7)(0.6) = 0.42

Problem 2, Version 1 (5 points)

Consider three binary events, A, B, and C, with probabilities given by P(A) = 0.7, P(B) = 0.4, and P(C) = 0.3.

(a) If B and C are mutually exclusive, what's P(B ∧ ¬C)?

Solution: If B and C are mutually exclusive, then P(B ∧ C) = 0, so P(B ∧ ¬C) = P(B) = 0.4

(b) What's the largest possible P(¬(B ∧ C))?

Solution: ¬(B ∧ C) is the same as (¬B ∨ ¬C).
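The Problem 1 computations above reduce to a few lines of arithmetic; a quick Python check (our own sketch, not part of the original exam — variable names are ours):

```python
PA, PB, PC = 0.7, 0.4, 0.3

# Largest possible P(B AND C): the smaller event can lie entirely inside the larger.
max_b_and_c = min(PB, PC)              # 0.3

# Smallest possible P(A AND B): inclusion-exclusion with P(A OR B) <= 1.
min_a_and_b = max(0.0, PA + PB - 1.0)  # 0.1 (up to float rounding)

# Under independence: products and inclusion-exclusion.
a_and_b = PA * PB                      # 0.28 (up to float rounding)
a_or_b = PA + PB - PA * PB             # 0.82 (up to float rounding)
a_and_not_b = PA * (1.0 - PB)          # 0.42 (up to float rounding)

# Largest possible P(A OR B) is capped at 1.
max_a_or_b = min(1.0, PA + PB)         # 1.0

print(max_b_and_c, min_a_and_b, a_and_b, a_or_b, a_and_not_b, max_a_or_b)
```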
P(¬B ∨ ¬C) = P(¬B) + P(¬C) − P(¬B ∧ ¬C) ≤ P(¬B) + P(¬C) = 0.6 + 0.7 = 1.3
Since 1.3 > 1, we conclude that the largest possible P(¬(B ∧ C)) = 1; this occurs if B and C are mutually exclusive.

Problem 3

The article is only three words long; it contains the words

A = (W1 = 2, W2 = 1, W3 = 0)

What is P(Y = 1, A)?

Solution: P(Y = 1, A) = P(Y = 1) P(W1 = 2|Y = 1) P(W2 = 1|Y = 1) P(W3 = 0|Y = 1) = (0.4)(0.1)(0.4)(0.4)

Problem 4, Version 1 (5 points)

Consider the following Bayes network (all variables are binary): A → C ← B

P(A) = 0.4, P(B) = 0.1

A, B          P(C|A,B)
False, False  0.7
False, True   0.7
True, False   0.1
True, True    0.9

(a) What is P(C)? Write your answer in numerical form, but you don't need to simplify.

Solution:
P(C) = P(¬A,¬B,C) + P(¬A,B,C) + P(A,¬B,C) + P(A,B,C)
     = (0.6)(0.9)(0.7) + (0.6)(0.1)(0.7) + (0.4)(0.9)(0.1) + (0.4)(0.1)(0.9)

(b) What is P(A|B = True, C = True)? Write your answer in numerical form, but you don't need to simplify.

Solution:
P(A|B,C) = P(A,B,C) / [P(A,B,C) + P(¬A,B,C)] = (0.4)(0.1)(0.9) / [(0.4)(0.1)(0.9) + (0.6)(0.1)(0.7)]

Problem 4, Version 2 (5 points)

Consider the following Bayes network (all variables are binary): A ← C → B

P(C) = 0.1

C      P(A|C)  P(B|C)
False  0.8     0.7
True   0.4     0.7

(a) What is P(A)? Write your answer in numerical form, but you don't need to simplify.

Solution: P(A) = P(¬C,A) + P(C,A) = (0.9)(0.8) + (0.1)(0.4)

(b) What is P(C|A = True, B = True)? Write your answer in numerical form, but you don't need to simplify.

Solution:
P(C|A,B) = P(A,B,C) / [P(A,B,C) + P(A,B,¬C)] = (0.1)(0.4)(0.7) / [(0.1)(0.4)(0.7) + (0.9)(0.8)(0.7)]

Problem 4, Version 3 (5 points)

Consider the following Bayes network (all variables are binary): A → B → C

P(A) = 0.8

A      P(B|A)
False  0.7
True   0.3

B      P(C|B)
False  0.5
True   0.7
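The enumerate-the-joint pattern used in the Problem 4 solutions can be sketched in Python; shown here for Version 1 (A and B parents of C), with the same pattern applying to the other versions. This is our own illustration, not part of the original exam:

```python
from itertools import product

# Problem 4, Version 1 parameters.
P_A, P_B = 0.4, 0.1
P_C_given = {(False, False): 0.7, (False, True): 0.7,
             (True, False): 0.1, (True, True): 0.9}

def joint(a, b, c):
    """P(A=a, B=b, C=c) via the chain rule of the network."""
    p = (P_A if a else 1 - P_A) * (P_B if b else 1 - P_B)
    pc = P_C_given[(a, b)]
    return p * (pc if c else 1 - pc)

# (a) Marginalize out A and B.
p_c = sum(joint(a, b, True) for a, b in product([False, True], repeat=2))

# (b) Condition by normalizing over the unobserved variable A.
num = joint(True, True, True)
p_a_given_bc = num / (num + joint(False, True, True))

print(p_c, p_a_given_bc)  # 0.492 and 0.036/0.078, up to float rounding
```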
(a) What is P(C)? Write your answer in numerical form, but you don't need to simplify.

Solution:
P(C) = P(¬A,¬B,C) + P(¬A,B,C) + P(A,¬B,C) + P(A,B,C)
     = (0.2)(0.3)(0.5) + (0.2)(0.7)(0.7) + (0.8)(0.7)(0.5) + (0.8)(0.3)(0.7)

(b) What is P(A|B = True, C = True)? Write your answer in numerical form, but you don't need to simplify.

Solution:
P(A|B,C) = P(A,B,C) / [P(A,B,C) + P(¬A,B,C)] = (0.8)(0.3)(0.7) / [(0.8)(0.3)(0.7) + (0.2)(0.7)(0.7)]

Problem 5, Version 1 (5 points)

Consider the following Bayes network (all variables are binary): A → C ← B

You've been asked to re-estimate the parameters of the network based on the following observations:

Observation  A      B      C
1            True   False  False
2            False  False  True
3            True   True   False
4            False  False  False

(a) Given the data in the table, what are the maximum likelihood estimates of the model parameters? If there is a model parameter that cannot be estimated from these data, mark it "UNKNOWN."

Solution: P(A) = 2/4, P(B) = 1/4, and

A, B  P(C|A,B)
F, F  1/2
F, T  UNKNOWN
T, F  0/1
T, T  0/1

Problem 5, another version (solution tables for the chain network A → B → C):

A      P(B|A)
False  1/4
True   2/4

B      P(C|B)
False  2/5
True   1/3

Problem 6, Version 1 (5 points)

Consider the following probabilistic context-free grammar:

S → NP VP        P = 1.0
NP → N           P = 0.9
NP → J N         P = 0.1
VP → V           P = 0.3
VP → V NP        P = 0.7
J → beautiful    P = 0.4
J → complicated  P = 0.6
N → birds        P = 0.8
N → flowers      P = 0.2
V → enjoy        P = 0.5
V → grow         P = 0.5

Consider the sentence "Complicated flowers enjoy birds." What is the probability that this sentence would be generated by the grammar shown above? (Ignore capitalization and punctuation.)
Solution: The rules that fire are

S → NP VP        P = 1.0
NP → J N         P = 0.1
J → complicated  P = 0.6
N → flowers      P = 0.2
VP → V NP        P = 0.7
V → enjoy        P = 0.5
NP → N           P = 0.9
N → birds        P = 0.8

The product of these probabilities is P = (1.0)(0.1)(0.6)(0.2)(0.7)(0.5)(0.9)(0.8)

Problem 6, Version 2 (5 points)

Consider the following probabilistic context-free grammar:

S → NP VP        P = 1.0
NP → N           P = 0.9
NP → J N         P = 0.1
VP → V           P = 0.3
VP → V NP        P = 0.7
J → beautiful    P = 0.4
J → complicated  P = 0.6
N → birds        P = 0.8
N → flowers      P = 0.2
V → enjoy        P = 0.5
V → grow         P = 0.5

Consider the sentence "Birds grow flowers." What is the probability that this sentence would be generated by the grammar shown above? (Ignore capitalization and punctuation.)

Solution: The rules that fire are

S → NP VP        P = 1.0
NP → N           P = 0.9
N → birds        P = 0.8
VP → V NP        P = 0.7
V → grow         P = 0.5
NP → N           P = 0.9
N → flowers      P = 0.2

The product of these probabilities is P = (1.0)(0.9)(0.8)(0.7)(0.5)(0.9)(0.2)

Problem 6, Version 3 (5 points)

Consider the following probabilistic context-free grammar:

S → NP VP        P = 1.0
NP → N           P = 0.9
NP → J N         P = 0.1
VP → V           P = 0.3
VP → V NP        P = 0.7
J → beautiful    P = 0.4
J → complicated  P = 0.6
N → birds        P = 0.8
N → flowers      P = 0.2
V → enjoy        P = 0.5
V → grow         P = 0.5

Consider the sentence "Flowers grow complicated flowers." What is the probability that this sentence would be generated by the grammar shown above? (Ignore capitalization and punctuation.)

Solution: The rules that fire are

S → NP VP        P = 1.0
NP → N           P = 0.9
N → flowers      P = 0.2
VP → V NP        P = 0.7
V → grow         P = 0.5
NP → J N         P = 0.1
J → complicated  P = 0.6
N → flowers      P = 0.2

The product of these probabilities is P = (1.0)(0.9)(0.2)(0.7)(0.5)(0.1)(0.6)(0.2)

Problem 7, Version 1 (5 points)

A particular hidden Markov model (HMM) has state variable Xt and observation variables Et, where t denotes time. Suppose that this HMM has two states, Xt ∈ {0, 1}, and three possible observations, Et ∈ {0, 1, 2}.
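Each Problem 6 answer is just the product of the rule probabilities over the sentence's single parse; a Python sketch for Version 1 (our own illustration, rule list copied from the solution above):

```python
from math import prod

# Problem 6, Version 1: rules firing in the parse of
# "complicated flowers enjoy birds", with their probabilities.
rules = [
    ("S -> NP VP", 1.0),
    ("NP -> J N", 0.1), ("J -> complicated", 0.6), ("N -> flowers", 0.2),
    ("VP -> V NP", 0.7), ("V -> enjoy", 0.5),
    ("NP -> N", 0.9), ("N -> birds", 0.8),
]
p_sentence = prod(p for _, p in rules)
print(p_sentence)  # (1.0)(0.1)(0.6)(0.2)(0.7)(0.5)(0.9)(0.8) ≈ 0.003024
```

A list (rather than a dict) is used so that repeated rules, as in Version 2 where NP → N fires twice, each contribute a factor.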
The initial state probability is P(X1 = 1) = 0.3. The transition and observation probability matrices are

Xt−1  P(Xt = 1|Xt−1)
0     0.6
1     0.4

Xt  P(Et = 0|Xt)  P(Et = 1|Xt)
0   0.4           0.1
1   0.1           0.6

Suppose that, in a particular test of the HMM, the observation sequence is {E1, E2} = {2, 1}
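The observation table omits P(Et = 2|Xt); since each row of the observation model sums to one, it follows by subtraction: P(E = 2|X = 0) = 0.5 and P(E = 2|X = 1) = 0.3. A minimal Python sketch of the model and of the first forward step for the observed E1 = 2 (our own illustration, not part of the original solutions):

```python
# Model parameters from Problem 7, Version 1.
p_x1 = {0: 0.7, 1: 0.3}           # initial distribution P(X1)
p_trans = {0: 0.6, 1: 0.4}        # P(X_t = 1 | X_{t-1})
p_obs = {0: {0: 0.4, 1: 0.1},     # P(E_t | X_t = 0)
         1: {0: 0.1, 1: 0.6}}     # P(E_t | X_t = 1)

# Each row of the observation model sums to one, so P(E = 2 | X)
# follows by subtraction.
for x in (0, 1):
    p_obs[x][2] = 1.0 - p_obs[x][0] - p_obs[x][1]

# First forward step: alpha1[x] = P(X1 = x, E1 = 2).
alpha1 = {x: p_x1[x] * p_obs[x][2] for x in (0, 1)}
print(p_obs[0][2], p_obs[1][2], alpha1)
```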