Final Exam with Answers - Probability and Statistics in Engineering I | IE 23000

Material Type: Exam; Professor: Schmeiser; Class: Probability And Statistics In Engineering I; Subject: IE-Industrial Engineering; University: Purdue University - Main Campus; Term: Fall 2006;

Partial preview of the text

IE 230                                                      Seat # ________

(1 pt) Name ___ < KEY > ___

Closed book and notes. 120 minutes. Cover page, five pages of exam. No calculators.

Score ___________________________

Final Exam, Fall 2006 (Dec. 16)                             Schmeiser
IE 230 — Probability & Statistics in Engineering I

Consider an experiment that chooses a random sample X1, X2, ..., Xn from a population. Suppose that the observations are independent and identically distributed with cdf F, mean E(X) = µ, variance V(X) = σ², and median x0.5 = F⁻¹(0.5). Let X̄ denote the sample mean, S² = [Σi=1..n (Xi − X̄)²] / (n − 1) denote the sample variance, and X(i) the ith order statistic.

1. True or false. (2 points each)

   (a) E(X̄) = E(X).                                              < True >
   (b) E(S²) = σ².                                                < True >
   (c) E(X̄) = X̄.                                                 < False >
   (d) V(X̄) = σ² / √n.                                           < False >
   (e) ste(X̄) = σ.                                               < False >
   (f) X(n) = min{X1, X2, ..., Xn}.                               < False >
   (g) The maximum-likelihood estimator determines the distribution that makes the observed sample most likely.   < True >
   (h) If the observations are from a normal distribution, then for every sample size n the sample mean X̄ has a normal distribution.   < True >
   (i) The bias of S as a point estimator of σ is Bias[S, σ] = [E(S) − σ]².   < False >
   (j) The empirical cdf is obtained by creating a scatter plot of the ith observation, x(i), with i / (n + 1), for i = 1, 2, ..., n.   < True >

2. (2 points each) Fill in the blanks with course jargon (a word or phrase). (Here Θ̂ is a point estimator of θ.)

   (a) A subset of the sample space: __ < event > __.
   (b) A function that assigns a real number to every outcome of the experiment: __ < random variable > __.
   (c) A function of a sample from a population: __ < statistic > __.
   (d) A single number, computed from a sample, used to estimate an unknown parameter of the population: __ < point estimate > __.
   (e) E[(Θ̂ − θ)²]: __ < mean squared error > __.

5. For i = 1, 2, ..., n, let Xi = 1 if Bernoulli trial i is a success and zero otherwise. Let p denote the probability of success for each trial.

   (a) (3 points) Is each of the Xi's an unbiased estimator for p?   < yes >

   (b) (6 points) Explain your answer to Part (a). (Ideally, provide a proof. A word argument might be adequate.)

       E(Xi) = Σ(all x) x fX(x) = (0)(1 − p) + (1)(p) = p.
       Therefore, bias(Xi, p) = E(Xi) − p = p − p = 0.
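The Problem 5 result can also be checked numerically. The sketch below is not part of the exam: it assumes NumPy is available, and the success probability p = 0.3, the seed, and the replication count are arbitrary illustrative choices.

```python
# Monte Carlo check of the Problem 5 key: a single Bernoulli indicator X_i
# is an unbiased estimator of p, i.e. E(X_i) = p and bias(X_i, p) = 0.
import numpy as np

rng = np.random.default_rng(seed=0)   # seed chosen only for reproducibility
p = 0.3                               # illustrative success probability (not from the exam)
replications = 200_000                # number of simulated trials

# Each replication draws one Bernoulli(p) observation, playing the role of X_1.
x1 = rng.binomial(n=1, p=p, size=replications)

estimated_mean = x1.mean()            # Monte Carlo estimate of E(X_1)
estimated_bias = estimated_mean - p   # should be close to 0

print(f"E(X_1) is approximately {estimated_mean:.4f}; bias is approximately {estimated_bias:+.4f}")
```

With 200,000 replications the standard error of the estimated mean is about √(0.3 × 0.7 / 200000) ≈ 0.001, so the reported bias should typically land within roughly 0.002 of zero.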
6. (3 points each) Suppose that p̂ = Σi=1..n Xi / n, where the Xi's are as defined in Problem 5. Fill in each blank using one of the following terms: (i) definition of "_______", (ii) independence, (iii) total probability, (iv) Problem 5, (v) mutually exclusive, (vi) standard error, (vii) probability result with no assumption needed, (viii) algebra. (If answering with (i), definition of what? You can use an answer more than one time.)

       E(p̂) = E(Σi=1..n Xi / n)                                   __ < (i) definition of p̂ > __
             = E[(1/n) X1 + (1/n) X2 + ... + (1/n) Xn]             __ < (viii) algebra > __
             = (1/n) E(X1) + (1/n) E(X2) + ... + (1/n) E(Xn)       __ < (vii) always true > __
             = (1/n) p + (1/n) p + ... + (1/n) p                   __ < (iv) Problem 5 > __
             = p                                                   __ < (viii) algebra > __

   (3 points) True or false: p̂ is an unbiased estimator of p, for every positive integer n.   < True >

7. Suppose that X1, X2, ..., X10 are independent and identically distributed with the continuous normal distribution with mean µ = 10 and variance σ² = 1.

   (8 points) Sketch the pdf of the sample mean. (Label and scale all axes.)

       The distribution of X̄, the sample mean of normal random variables, is normal. The mean is E(X̄) = E(X) = 10. The variance is V(X̄) = V(X) / n = 1 / 10, so the standard deviation is 1 / √10. Sketch the usual bell curve. Label the horizontal axis with any dummy variable, say x. Scale the axis with any two numbers, probably including µ = 10. Label the vertical axis with fX̄(x). Scale the axis with any two numbers, probably including zero.
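The Problem 7 answer can likewise be verified by simulation. The sketch below is not part of the exam; it assumes NumPy, and the seed and replication count are arbitrary illustrative choices. It checks that the sample mean of ten N(10, 1) observations has mean 10 and variance 1/10.

```python
# Simulation check of the Problem 7 key: the sample mean of n = 10 iid
# N(mu = 10, sigma^2 = 1) observations is N(10, 1/10), so its standard
# deviation is 1/sqrt(10), about 0.316.
import numpy as np

rng = np.random.default_rng(seed=1)   # seed chosen only for reproducibility
n, mu, sigma = 10, 10.0, 1.0          # values from Problem 7
replications = 100_000                # number of simulated samples (arbitrary choice)

# Each row is one sample of size n; each row's average is one realization of Xbar.
samples = rng.normal(loc=mu, scale=sigma, size=(replications, n))
xbar = samples.mean(axis=1)

print(f"E(Xbar)  approx {xbar.mean():.4f}   (theory: {mu})")
print(f"V(Xbar)  approx {xbar.var(ddof=1):.4f}   (theory: {sigma**2 / n:.4f})")
print(f"sd(Xbar) approx {xbar.std(ddof=1):.4f}   (theory: {sigma / np.sqrt(n):.4f})")
```

The printed variance should land near 0.1000 and the standard deviation near 0.3162, matching the values used to scale the bell-curve sketch in the key.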