Midterm Exam for EECS 121: Information Theory and Probability (UC Berkeley, Spring 2000)

The midterm exam for the University of California, Berkeley EECS 121 course on information theory and probability, taught by Professor David Tse in Spring 2000. The exam covers prefix-free codes, Kraft's inequality, Huffman coding, Gaussian processes, power spectral densities, DPCM quantization, and white noise modeling. Students are required to explain their answers carefully, and the exam is worth 100 points.


Name: _______________________________

UNIVERSITY OF CALIFORNIA
College of Engineering
Department of Electrical Engineering and Computer Sciences

Professor David Tse
Spring 2000

EECS 121 — MIDTERM (7:00-9:00 p.m., Wednesday, March 8, 2000)

Please explain your answers carefully. There are 100 points in total, and Question 4, part c) is a bonus.

Problem 1 (30 points)

[8 pts.] a) Argue that for any binary code satisfying the prefix-free condition, the codeword lengths $\{l_i\}$ must satisfy Kraft's inequality: $\sum_i 2^{-l_i} \le 1$.

[6 pts.] b) Is it true that for any source, the codeword lengths of the binary Huffman code must satisfy Kraft's inequality with equality? Explain.

[6 pts.] c) Suppose now the coded symbols are drawn from a general alphabet of size $D$. The Kraft inequality becomes $\sum_i D^{-l_i} \le 1$. Is it true that the Huffman code must satisfy Kraft's inequality with equality? Explain.

[10 pts.] d) Consider a source for which the letter probabilities are all of the form $2^{-k}$, where $k$ is an integer. Construct the Huffman code and give the corresponding codeword lengths. Justify that the code is optimal.

Problem 2 (30 points)

Let $\{X(t)\}$ be a zero-mean WSS Gaussian process with autocorrelation function $R_X(\tau) = e^{-|\tau|}$.

[6 pts.] a) Find its power spectral density.

[8 pts.] b) Suppose we sample this process every $T$ seconds. Is the resulting discrete-time process Gaussian? WSS? If so, compute its autocorrelation function.

[8 pts.] c) Let $\{Y_n\}$ denote the sampled process. We perform DPCM quantization with LLSE prediction of $Y_n$ from $Y_{n-1}$. Find the distribution of the residual error $Y_n - \hat{Y}_n$.

[8 pts.] d) The residual error is quantized by a single-bit quantizer to the values $\pm\Delta$. Find the optimal choice of $\Delta$ as a function of $T$. What happens as $T \to 0$?
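Parts a), c), and d) of Problem 1 can be checked numerically. The sketch below is illustrative (the helper names `kraft_sum` and `huffman_lengths` are mine, not part of the exam): it evaluates the Kraft sum $\sum_i D^{-l_i}$ and builds a binary Huffman code for a dyadic source whose probabilities are all of the form $2^{-k}$, for which the Kraft inequality holds with equality.

```python
# Minimal sketch for Problem 1 (assumed helper names, not from the exam).
import heapq

def kraft_sum(lengths, D=2):
    """Sum of D^(-l_i); any prefix-free code satisfies this sum <= 1."""
    return sum(D ** (-l) for l in lengths)

def huffman_lengths(probs):
    """Codeword lengths of a binary Huffman code for the given pmf."""
    # Heap entries: (subtree probability, list of symbol indices inside it).
    heap = [(p, [i]) for i, p in enumerate(probs)]
    heapq.heapify(heap)
    lengths = [0] * len(probs)
    while len(heap) > 1:
        p1, s1 = heapq.heappop(heap)
        p2, s2 = heapq.heappop(heap)
        for i in s1 + s2:          # each merge adds one bit to all members
            lengths[i] += 1
        heapq.heappush(heap, (p1 + p2, s1 + s2))
    return lengths

probs = [1/2, 1/4, 1/8, 1/8]       # dyadic source, as in part (d)
lengths = huffman_lengths(probs)
print(lengths)                     # -> [1, 2, 3, 3]
print(kraft_sum(lengths))          # -> 1.0, i.e. equality for a dyadic pmf
```

For a dyadic source the Huffman lengths equal $-\log_2 p_i$, matching the entropy bound, which is one way to justify optimality in part d).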
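For Problem 2, the closed forms below follow from standard facts about Gaussian processes and LLSE prediction; this is a hedged sketch, not a posted solution, and the function names are my own. With $R_X(\tau) = e^{-|\tau|}$, the sampled process has correlation $\rho = e^{-T}$ at lag one, the LLSE predictor is $\hat{Y}_n = \rho\, Y_{n-1}$, and the residual is Gaussian with variance $1 - e^{-2T}$; for a zero-mean Gaussian, the MMSE 1-bit quantizer places its levels at the conditional means $\pm\sigma\sqrt{2/\pi}$.

```python
# Sketch for Problem 2, parts c) and d), assuming R_X(tau) = exp(-|tau|).
# (For part a), the power spectral density is S_X(f) = 2 / (1 + (2*pi*f)^2),
# the Fourier transform of exp(-|tau|).)
import math

def residual_std(T):
    """Std dev of the LLSE prediction residual Y_n - rho*Y_{n-1}.

    With unit-variance samples and lag-one correlation rho = exp(-T),
    Var(residual) = 1 - rho^2 = 1 - exp(-2T).
    """
    rho = math.exp(-T)
    return math.sqrt(1.0 - rho * rho)

def optimal_delta(T):
    """MMSE levels +/-Delta of a 1-bit quantizer for the Gaussian residual.

    The optimal levels are the conditional means of each half-line:
    Delta = sigma * sqrt(2/pi). As T -> 0, Delta -> 0: adjacent samples
    become perfectly correlated and the residual vanishes.
    """
    return residual_std(T) * math.sqrt(2.0 / math.pi)
```

Note the limiting behavior for part d): `optimal_delta(T)` shrinks like $\sqrt{2T}\cdot\sqrt{2/\pi}$ as $T \to 0$, since $1 - e^{-2T} \approx 2T$ for small $T$.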