Probability and Statistics in Engineering I - Final Exam Solutions | IE 23000

Material Type: Exam; Professor: Schmeiser; Class: Probability And Statistics In Engineering I; Subject: IE-Industrial Engineering; University: Purdue University - Main Campus; Term: Fall 2002;

Partial preview of the text

IE 230 — Probability & Statistics in Engineering I        Name ___< KEY >___

Closed book and notes. No calculators. 120 minutes. Cover page, five pages of exam, and tables for discrete and continuous distributions. This is a 125-point exam. There is one point for writing your name neatly on each page.

Formula sheet:

    X̄ ≡ Σ_{i=1}^{n} X_i / n
    S_X² ≡ Σ_{i=1}^{n} (X_i − X̄)² / (n − 1) = [ Σ_{i=1}^{n} X_i² − n X̄² ] / (n − 1)
    E(W) ≡ center of gravity of f_W
    COV(X, Y) ≡ E[ (X − E(X)) (Y − E(Y)) ] = E(XY) − E(X) E(Y)
    V(X) ≡ COV(X, X)
    CORR(X, Y) ≡ COV(X, Y) / √( V(X) V(Y) )
    E(aX + bY) = a E(X) + b E(Y)
    V(aX + bY) = a² V(X) + b² V(Y) + 2ab COV(X, Y)

Score ___________________________

Final Exam (a), Fall 2002 — Schmeiser

1. True or false. (For each: 3 points if correct, 2 points if left blank.)

(a) True.  A random variable is a function that assigns a real number to every outcome of an experiment.

(b) True.  A statistic is a function of a random sample.

(c) True.  A statistic is a random variable.

(d) False. Statistical inference makes conclusions about a sample given assumptions about a population. (It is the reverse: conclusions about a population based on a sample.)

(e) False. Chebyshev's Inequality is valid only for the normal distribution. (It holds for any distribution with finite variance.)

(f) False. Method of Moments and Maximum Likelihood are both methods to determine the optimal way to take a random sample. (They are methods of parameter estimation.)

(g) True.  For any events A and B, it is true that (A ∪ B)′ = A′ ∩ B′.

(h) True.  The geometric and exponential distributions both have the memoryless property.

(i) False. The "bell curve" is the cdf of the normal distribution. (It is the pdf.)

(j) False. If random variables X and Y are independent, then Cov(X, Y) > 0. (Independence implies Cov(X, Y) = 0.)

(k) True.  The sum of five independent geometric random variables has a negative-binomial distribution.
(l) False. The acronym "iid" means "individual and independent distribution". (It means "independent and identically distributed".)

(m) True.  If (X, Y) has a bivariate-normal distribution, then the marginal distribution of X is normal.

2. (8 points) Consider any one of the T/F questions in Problem 1. A student will receive three points if the answer is correct and zero points if the answer is incorrect. If the question is left blank, the student will receive two points. Suppose that a student decides to answer a question only if his or her expected number of points is greater than two. Let C denote the event that the student's answer is correct. Determine the values of P(C) that will lead the student to answer the question.

    If the student does not guess, the number of points is two, with no uncertainty. If the student guesses, the expected number of points is 3 P(C) + 0 (1 − P(C)) = 3 P(C), where C is the event that the guess is correct. Therefore, guessing has a greater expected value than not guessing if and only if 3 P(C) > 2, which is equivalent to 2/3 < P(C) ≤ 1. ←

5. Let X denote the number of dots facing up when a fair six-sided die is tossed.
What is the value of

(a) (3 points) E(X²)

    E(X²) = Σ_{all x} x² f(x)                       definition of expected value
          = 1² f(1) + 2² f(2) + ... + 6² f(6)       the range of X is {1, 2, ..., 6}
          = 1² (1/6) + 2² (1/6) + ... + 6² (1/6)    X is discrete uniform
          = 91/6 ←                                  simplify

(b) (3 points) P(X = 1, X = 2)

    P(X = 1, X = 2) = P(X = 1) P(X = 2 | X = 1)     multiplication rule
                    = (1/6) (0)                     mutually exclusive events
                    = 0 ←                           simplify

    Equivalently, P(X = 1, X = 2) = P(∅) = 0. ←

(c) (3 points) V(X² | X = 2)

    The conditional distribution of X, given X = 2, is P(X = 2 | X = 2) = 1 and zero elsewhere. Therefore E(X² | X = 2) = 2² = 4 and V(X² | X = 2) = (4 − 4)² (1) = 0. ←

    Equivalently, given that X = 2, there is no uncertainty, so directly V(X² | X = 2) = 0. ←

6. (6 points) Suppose that a random sample of three observations is obtained from a uniform distribution with lower bound 0 and upper bound 2. If the three independently chosen observations are x1 = 1.2, x2 = 1.3, x3 = 1.8, what is the likelihood of the sample? (Recall: The likelihood is L = Π_{i=1}^{n} f(x_i).)

    L = Π_{i=1}^{n} f(x_i)          definition of likelihood
      = f(x1) f(x2) f(x3)           here n = 3
      = f(1.2) f(1.3) f(1.8)        substitute sample values
      = (1/2) (1/2) (1/2)           f(x) = 1/2 on [0, 2]
      = 1/8 ←                       simplify

7. Suppose that traffic accidents occur according to a Poisson process with rate λ = 2 per hour.
(a) (7 points) If there has been no accident in the previous twenty minutes, what is the probability of there being exactly one accident in the next hour?

    Let N be the number of accidents in the next hour. Accidents occur according to a Poisson process, so N is Poisson with mean µ = λt = (2)(1) = 2 accidents; the accident-free previous twenty minutes are irrelevant. Then

    P(one accident) = P(N = 1) = e^{−µ} µ^1 / 1! = 2 e^{−2} ← ≈ 0.271

(b) (7 points) What is the expected time until three accidents occur?

    Let X denote the time until the third accident. Then X is Erlang with parameter values r = 3 and λ = 2. Therefore, E(X) = r / λ = 3/2 hours. ←

8. Two students are shooting (basketball) free throws. They each shoot ten times. To make the game more interesting, they decide to pay each other $1 for each shot made. Suppose that the first student is a 40% free-throw shooter and that the second student is a 50% free-throw shooter. Let X1 denote the dollars paid to the first shooter and X2 the dollars paid to the second shooter. Then X = X1 − X2 is the (net) winnings of the first student.

(a) (5 points) What is the value of E(X)?

    E(X) = E(X1 − X2)               definition of X
         = E(X1) − E(X2)            expected value of a linear combination
         = n p1 − n p2              binomial means
         = (10)(0.4) − (10)(0.5)    given values
         = −1 (dollars)             simplify

(b) (5 points) What is (approximately) the value of P(X > 0)?
    (i) 0    (ii) 0.05    (iii) 0.4 ←    (iv) 0.5    (v) 0.6    (vi) 0.95    (vii) 1

IE 230 — Discrete Distributions: Summary Table (from the Concise Notes)

(C(n, x) denotes the binomial coefficient "n choose x".)

General
    Range: x1, x2, ..., xn
    pmf: f(x) = f_X(x) = P(X = x)
    Mean: µ = µ_X = E(X) = Σ_{i=1}^{n} x_i f(x_i)
    Variance: σ² = σ_X² = V(X) = Σ_{i=1}^{n} (x_i − µ)² f(x_i) = E(X²) − µ²

Discrete uniform
    Range: x1, x2, ..., xn
    pmf: 1/n
    Mean: Σ_{i=1}^{n} x_i / n
    Variance: [ Σ_{i=1}^{n} x_i² / n ] − µ²

Equal-space uniform
    Range: x = a, a+c, ..., b, where n = (b − a + c) / c
    pmf: 1/n
    Mean: (a + b) / 2
    Variance: c² (n² − 1) / 12

Binomial (number of successes in n Bernoulli trials)
    Range: x = 0, 1, ..., n
    pmf: C(n, x) p^x (1 − p)^{n−x}
    Mean: np
    Variance: np(1 − p)

Geometric (number of Bernoulli trials until the first success)
    Range: x = 1, 2, ...
    pmf: p (1 − p)^{x−1}
    Mean: 1/p
    Variance: (1 − p) / p²

Negative binomial (number of Bernoulli trials until the r-th success)
    Range: x = r, r+1, ...
    pmf: C(x−1, r−1) p^r (1 − p)^{x−r}
    Mean: r/p
    Variance: r(1 − p) / p²

Hypergeometric (number of successes in a sample of size n drawn without replacement from a population of size N containing K successes)
    Range: x = (n − (N − K))⁺, ..., min{K, n}
    pmf: C(K, x) C(N−K, n−x) / C(N, n)
    Mean: np, where p = K/N
    Variance: np(1 − p)(N − n) / (N − 1)

Poisson (number of counts in time t from a Poisson process with rate λ)
    Range: x = 0, 1, ...
    pmf: e^{−µ} µ^x / x!, where µ = λt
    Mean: µ
    Variance: µ
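The keyed numerical answers, and two entries from the summary table, can be sanity-checked with a short script. This is an illustrative sketch using only the Python standard library, not part of the original exam; the random seed, trial count, and series truncation point are arbitrary choices.

```python
import math
import random
from fractions import Fraction

# Problem 2: answering beats leaving blank iff 3*P(C) > 2, i.e. P(C) > 2/3.
def expected_if_answered(p_c):
    return 3 * p_c + 0 * (1 - p_c)   # 3 P(C) + 0 (1 - P(C))

assert abs(expected_if_answered(2 / 3) - 2) < 1e-12   # break-even point
assert expected_if_answered(0.7) > 2                  # worth answering
assert expected_if_answered(0.6) < 2                  # better to leave blank

# Problem 5(a): E(X^2) = 91/6 for a fair six-sided die.
e_x2 = sum(Fraction(x * x, 6) for x in range(1, 7))
assert e_x2 == Fraction(91, 6)

# Problem 6: likelihood of (1.2, 1.3, 1.8) under Uniform(0, 2), f(x) = 1/2.
likelihood = math.prod(0.5 if 0 <= x <= 2 else 0.0 for x in [1.2, 1.3, 1.8])
assert likelihood == 1 / 8

# Problem 7(a): P(N = 1) for N ~ Poisson with mean mu = 2.
p_one = math.exp(-2) * 2 ** 1 / math.factorial(1)
assert abs(p_one - 2 * math.exp(-2)) < 1e-15          # = 2 e^{-2} (about 0.271)

# Problem 8(a): E(X) = 10(0.4) - 10(0.5) = -1 dollar.
assert 10 * 0.4 - 10 * 0.5 == -1.0

# Problem 8(b): Monte Carlo estimate of P(X > 0); seed and trials are arbitrary.
random.seed(0)
trials = 100_000
wins = sum(
    sum(random.random() < 0.4 for _ in range(10))     # X1 ~ Binomial(10, 0.4)
    > sum(random.random() < 0.5 for _ in range(10))   # X2 ~ Binomial(10, 0.5)
    for _ in range(trials)
)
assert wins / trials < 0.5                            # consistent with E(X) = -1 < 0

# Summary table: geometric mean 1/p and variance (1-p)/p^2, with the infinite
# range truncated at 200 terms (the tail is negligible for p = 0.5).
p = 0.5
mean_geo = sum(x * p * (1 - p) ** (x - 1) for x in range(1, 200))
var_geo = sum((x - 1 / p) ** 2 * p * (1 - p) ** (x - 1) for x in range(1, 200))
assert abs(mean_geo - 1 / p) < 1e-9
assert abs(var_geo - (1 - p) / p ** 2) < 1e-9
```

The Monte Carlo estimate is only checked for being below 1/2; among the listed choices, the key marks (iii) 0.4 as the approximate answer.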