M. Phil. in Statistical Science: Information and Coding Exam Questions

Exam questions for the M.Phil. in Statistical Science paper "Information and Coding", held on 1 June 2007. The questions cover topics such as Huffman coding, information theory, and cyclic codes. Candidates are required to answer three of the four questions, which carry equal weight. The document also includes hints and solutions for some of the questions.

Partial preview of the text

M. PHIL. IN STATISTICAL SCIENCE

Friday 1 June 2007    9.00 to 11.00

INFORMATION AND CODING

Attempt THREE questions. There are FOUR questions in total. The questions carry equal weight.

STATIONERY REQUIREMENTS: Cover sheet, Treasury Tag, Script paper
SPECIAL REQUIREMENTS: None

You may not start to read the questions printed on the subsequent pages until instructed to do so by the Invigilator.

1  Consider an alphabet with m letters, each of which appears with probability 1/m. A binary Huffman code is used to encode the letters, in order to minimise the expected codeword length (s_1 + ... + s_m)/m, where s_i is the length of the codeword assigned to letter i. Set s = max{s_i : 1 ≤ i ≤ m}, and let n_ℓ be the number of codewords of length ℓ.

(a) Show that 2 ≤ n_s ≤ m.

(b) For what values of m is n_s = m?

(c) Determine s in terms of m. [Hint: you may find it useful to write m = a·2^k where 1 ≤ a < 2.]

(d) Prove that n_{s−1} + n_s = m, i.e. any two codeword lengths differ by at most one.

(e) Determine n_{s−1} and n_s.

(f) Describe the codeword lengths for an idealised model of English (with m = 27).

2  Consider an information source emitting a sequence of letters (U_n) which are independent identically distributed random variables taking values 1, ..., m with probabilities p_1, ..., p_m. Let u^(n) = (u_1, ..., u_n) denote a sample string of length n from the source. Given 0 < ε < 1, let M(n, ε) denote the minimal size of a set of strings u^(n) of total probability at least 1 − ε. Show the existence of the limit

    lim_{n→∞} (1/n) log_2 M(n, ε)

and determine its value. Comment on the significance of this result for coding theory.

3  State and prove the Hamming and Gilbert–Varshamov bounds for codes. State and prove the corresponding asymptotic bounds.
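The following is a minimal Python sketch, not part of the original paper, of the greedy Huffman construction via a min-heap (the function name huffman_lengths is ours). Run on a uniform m-letter alphabet, it reports s, n_{s−1} and n_s, which can be checked against parts (d)–(f) of Question 1; for m = 27 it yields five codewords of length 4 and twenty-two of length 5.

```python
import heapq
from itertools import count

def huffman_lengths(probs):
    """Greedy binary Huffman construction via a min-heap.
    Returns the codeword length assigned to each symbol."""
    tiebreak = count()  # breaks probability ties before lists are compared
    heap = [(p, next(tiebreak), [i]) for i, p in enumerate(probs)]
    heapq.heapify(heap)
    lengths = [0] * len(probs)
    while len(heap) > 1:
        p1, _, a = heapq.heappop(heap)   # two least probable subtrees
        p2, _, b = heapq.heappop(heap)
        for i in a + b:                  # every symbol under the merged node
            lengths[i] += 1              # gains one more bit in its codeword
        heapq.heappush(heap, (p1 + p2, next(tiebreak), a + b))
    return lengths

for m in (5, 8, 27):
    L = huffman_lengths([1.0 / m] * m)
    s = max(L)
    print(f"m={m}: s={s}, n_(s-1)={L.count(s - 1)}, n_s={L.count(s)}")
```

Consistent with part (d), only the two lengths s − 1 and s occur; one can check that n_{s−1} = 2^s − m and n_s = 2m − 2^s, with s = ⌈log_2 m⌉.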
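By the asymptotic equipartition property, the limit asked for in Question 2 equals the source entropy H = −Σ_i p_i log_2 p_i. As a numerical illustration (ours, not the paper's: the binary source with probabilities (0.8, 0.2) and ε = 0.1 are arbitrary choices), the sketch below computes M(n, ε) exactly by taking the most probable strings first, exploiting the fact that all strings with the same symbol counts are equiprobable:

```python
from math import ceil, comb, log2

def M(n, eps, p1=0.8, p2=0.2):
    """Minimal number of length-n strings with total probability >= 1 - eps,
    for an i.i.d. binary source with symbol probabilities (p1, p2)."""
    # All C(n, k) strings containing k copies of the second symbol share the
    # same probability, so n + 1 classes stand in for the 2^n strings.
    classes = sorted(
        ((p1 ** (n - k)) * (p2 ** k), comb(n, k)) for k in range(n + 1)
    )
    classes.reverse()                 # most probable strings first
    need, total = 1.0 - eps, 0
    for prob, size in classes:
        if prob * size < need:        # the whole class is needed
            need -= prob * size
            total += size
        else:                         # only part of this class is needed
            return total + ceil(need / prob)
    return total

H = -(0.8 * log2(0.8) + 0.2 * log2(0.2))    # entropy, about 0.7219 bits
for n in (8, 16, 24, 64):
    print(f"n={n:2d}  (1/n) log2 M(n, 0.1) = {log2(M(n, 0.1)) / n:.4f}  vs H = {H:.4f}")
```

The printed ratio stays above H and creeps towards it as n grows; the convergence is slow because the finite-n correction is of order 1/√n.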
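For Question 3, write A(n, d) for the largest size of a binary code of length n with minimum distance d, and V(n, r) = Σ_{i=0}^{r} C(n, i) for the volume of a Hamming ball of radius r. The Hamming (sphere-packing) bound states A(n, d) ≤ 2^n / V(n, ⌊(d−1)/2⌋), and the Gilbert–Varshamov bound states A(n, d) ≥ 2^n / V(n, d − 1). A short Python sketch (an illustration only, not the requested proofs; the parameter choices are ours) evaluates both:

```python
from math import comb

def V(n, r):
    """Number of points in a Hamming ball of radius r in {0,1}^n."""
    return sum(comb(n, i) for i in range(r + 1))

def hamming_bound(n, d):
    """Sphere-packing upper bound on A(n, d) for binary codes."""
    return 2 ** n // V(n, (d - 1) // 2)

def gv_bound(n, d):
    """Gilbert-Varshamov lower bound: a code of at least this size exists."""
    return -(-(2 ** n) // V(n, d - 1))   # ceiling division

for n, d in ((7, 3), (15, 5), (23, 7)):
    print(f"n={n:2d}, d={d}: {gv_bound(n, d)} <= A(n, d) <= {hamming_bound(n, d)}")
```

For (n, d) = (7, 3) and (23, 7) the upper bound is attained, by the perfect [7,4] Hamming code and the [23,12] binary Golay code respectively.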