M.Phil. in Statistical Science Exam: Information and Coding

A past exam paper from an M.Phil. in Statistical Science programme, focusing on information and coding. Candidates attempt three of four questions covering entropy, conditional entropy, and coding theory, using concepts such as the Gibbs inequality, the entropy of a Poisson random variable, the entropy power inequality, and the Hamming and Gilbert-Varshamov bounds.


M. PHIL. IN STATISTICAL SCIENCE

Thursday 8 June, 2006    9 to 11

INFORMATION AND CODING

Attempt THREE questions. There are FOUR questions in total.
Marks for each question are indicated on the paper in square brackets.
Each question is worth a total of 20 marks.

STATIONERY REQUIREMENTS: Cover sheet, Treasury Tag, Script paper
SPECIAL REQUIREMENTS: None

You may not start to read the questions printed on the subsequent pages until instructed to do so by the Invigilator.

1 (a) Consider two discrete random variables X and Y. Define the conditional entropy h(X | Y), and show that it satisfies h(X | Y) ≤ h(X), giving necessary and sufficient conditions for equality. You may assume the Gibbs inequality, provided that you state it carefully. [8]

(b) Consider two discrete random variables U and V with corresponding probability mass functions p_U and p_V. For each α ∈ [0, 1], define the mixture random variable W(α) by its mass function

    p_{W(α)}(x) = α p_U(x) + (1 − α) p_V(x).

Prove that for all α the entropy of W(α) satisfies

    h(W(α)) ≥ α h(U) + (1 − α) h(V).   [6]

(c) Define F(λ) to be the entropy of a Poisson random variable with mean λ > 0. Show that F(λ) is a non-decreasing function of λ > 0. [6]

2 State the Entropy Power Inequality for n-dimensional random vectors. [4]

Let X be a real-valued random variable with a density and finite differential entropy, and let the function g : R → R have a strictly positive derivative g′ everywhere. Prove that the random variable g(X) has differential entropy satisfying

    h(g(X)) = h(X) + E log_2 g′(X),

assuming that E log_2 g′(X) is finite. [7]

Let Y_1 and Y_2 be independent, strictly positive random variables with densities. Show that the differential entropy of the product Y_1 Y_2 satisfies

    2^{2 h(Y_1 Y_2)} ≥ α_1 2^{2 h(Y_1)} + α_2 2^{2 h(Y_2)},

where log_2(α_1) = 2 E log_2 Y_2 and log_2(α_2) = 2 E log_2 Y_1. [9]
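The inequalities in questions 1(a) and 1(b) are easy to sanity-check numerically. The short Python sketch below evaluates h(X | Y) for a small made-up joint distribution and the concavity bound for two made-up mass functions; the probability tables p_xy, p_u and p_v are hypothetical placeholders, not part of the exam.

```python
import numpy as np

def entropy(p):
    """Shannon entropy, in bits, of a probability vector p."""
    p = np.asarray(p, dtype=float)
    nz = p > 0
    return -np.sum(p[nz] * np.log2(p[nz]))

# Hypothetical joint distribution of (X, Y) on a 2 x 3 grid (rows: X, columns: Y).
p_xy = np.array([[0.10, 0.20, 0.10],
                 [0.25, 0.05, 0.30]])
p_x = p_xy.sum(axis=1)   # marginal of X
p_y = p_xy.sum(axis=0)   # marginal of Y

# h(X | Y) = sum_y p(y) * h(X | Y = y)
h_x_given_y = sum(p_y[j] * entropy(p_xy[:, j] / p_y[j]) for j in range(len(p_y)))
print(h_x_given_y <= entropy(p_x) + 1e-12)   # True: h(X | Y) <= h(X)

# Question 1(b): h(W(alpha)) >= alpha h(U) + (1 - alpha) h(V) for all alpha in [0, 1].
p_u, p_v = np.array([0.7, 0.2, 0.1]), np.array([0.1, 0.3, 0.6])
for alpha in np.linspace(0.0, 1.0, 11):
    p_w = alpha * p_u + (1 - alpha) * p_v
    assert entropy(p_w) >= alpha * entropy(p_u) + (1 - alpha) * entropy(p_v) - 1e-12
```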
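For question 1(c), the claimed monotonicity of F(λ) can be illustrated numerically. Below is a minimal sketch, assuming SciPy is available; the truncation tolerance and the grid of λ values are arbitrary choices made for the illustration.

```python
import numpy as np
from scipy.stats import poisson

def poisson_entropy(lam, tol=1e-12):
    """Entropy F(lam), in bits, of a Poisson(lam) variable, truncating the negligible tail."""
    k_max = int(poisson.isf(tol, lam)) + 1           # keep all mass above the tail tolerance
    p = poisson.pmf(np.arange(k_max + 1), lam)
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

lams = np.linspace(0.1, 20.0, 50)
F = [poisson_entropy(lam) for lam in lams]
print(all(F[i + 1] >= F[i] for i in range(len(F) - 1)))   # True: F is non-decreasing on this grid
```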
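The transformation identity in question 2 can be checked in closed form for one particular (hypothetical) choice: g(x) = exp(x) with X Gaussian, so that g(X) is lognormal and all three terms are known exactly. The sketch below relies on SciPy's entropy() method for continuous distributions, which returns differential entropy in nats; the parameter values are arbitrary.

```python
import numpy as np
from scipy.stats import norm, lognorm

# Check h(g(X)) = h(X) + E log_2 g'(X) for g(x) = exp(x) and X ~ N(mu, sigma^2).
# Then g(X) is lognormal with log-scale mu and shape sigma, and
# E log_2 g'(X) = E log_2 exp(X) = E[X] / ln 2 = mu / ln 2.
mu, sigma = 0.7, 1.3                                               # hypothetical parameters

h_x  = norm(mu, sigma).entropy() / np.log(2)                       # h(X), converted to bits
h_gx = lognorm(s=sigma, scale=np.exp(mu)).entropy() / np.log(2)    # h(g(X)), in bits
correction = mu / np.log(2)                                        # E log_2 g'(X)

print(np.isclose(h_gx, h_x + correction))   # True
```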


