18.413: Error-Correcting Codes Lab
Lecture 4, February 12, 2004
Lecturer: Daniel A. Spielman

4.1 Prior, Extrinsic and Posterior Probabilities, II

I should have given the classic example relating these probabilities: that of testing for a rare but deadly disease. Let's pretend that we have a test for Ebola that says "yes" 95% of the time if you do have Ebola, but also has a 5% chance of saying "yes" if you do not have it. Feeling sick, you go to the medical center, the resident in charge decides to test you for Ebola, and the test comes back "yes". Should you presume that you have Ebola?

In our language, we say that the extrinsic probability that you have Ebola, given that the test says "yes", is 0.95. On the other hand, to figure out the actual probability that you have Ebola, the posterior probability, we would need to know the prior probability that you have Ebola. Assuming that you are a randomly chosen U.S. resident, and that only 1 in 2,000,000 U.S. residents contracts Ebola each year, the prior probability that you have Ebola is 1/2,000,000. So the posterior probability that you have Ebola is

$$\frac{(1/2{,}000{,}000) \times 0.95}{(1/2{,}000{,}000) \times 0.95 + (1{,}999{,}999/2{,}000{,}000) \times 0.05} \approx 0.0000095.$$

You are quite relieved to observe that there is only about a 1/100,000 chance that you have Ebola. Of course, if you are not a randomly chosen U.S. resident, and just happened to return from a month of living in a jungle with monkeys, the prior probability will not be so favorable.

4.2 Normalizing constants

We will soon tire of complicated derivations involving multiple applications of the law of conditional probability. To simplify our calculations and notation, we make the following observation: if $x$ is chosen uniformly from an alphabet $A$ and $y$ is a random variable depending on $x$, then we have

$$\Pr[x = a \mid y = b] = \frac{\Pr[y = b \mid x = a] \,\Pr[x = a]}{\Pr[y = b]}.$$

While we do not know the probability that $y = b$, it depends only on $b$. Since $x$ is chosen uniformly, $\Pr[x = a]$ does not depend on $a$. So we will set

$$c_b = \frac{\Pr[x = a]}{\Pr[y = b]},$$

and write

$$\Pr[x = a \mid y = b] = c_b \Pr[y = b \mid x = a].$$

This will be fine as long as all the $c_b$ terms cancel out in our calculations (which they always will). In fact, when it is clear that these terms will cancel, we will just write

$$\Pr[x = a \mid y = b] \sim \Pr[y = b \mid x = a],$$

and skip the $c_b$ altogether.

4.3 Example: symmetric channels

For example, let $C$ be a symmetric channel with input alphabet $\{0, 1\}$ and some output alphabet $B$. Recall that the definition of a symmetric channel implies that for each $b \in B$ there is another symbol $b' \in B$ such that

$$\Pr[\text{rec } b \mid \text{sent } 0] = \Pr[\text{rec } b' \mid \text{sent } 1] \quad \text{and} \quad \Pr[\text{rec } b \mid \text{sent } 1] = \Pr[\text{rec } b' \mid \text{sent } 0].$$

Let $p$ denote $\Pr[\text{sent } 1 \mid \text{rec } b]$. Then $\Pr[\text{sent } 0 \mid \text{rec } b] = 1 - p$, and we can write

$$\Pr[\text{rec } b \mid \text{sent } 1] = c_b \cdot p, \qquad \Pr[\text{rec } b \mid \text{sent } 0] = c_b \cdot (1 - p).$$

This constant $c_b$ has an interpretation: any symmetric two-input channel can be described as a distribution over binary symmetric channels. That is, it is equivalent to choosing a set of crossover probabilities $p_1, \ldots, p_k$ and assigning to each induced channel $\mathrm{BSC}_{p_i}$ a probability $q_i$, where $\sum_{i=1}^{k} q_i = 1$. When an input is fed to the channel, the channel first chooses an $i$ according to the distribution $\vec{q}$, and then passes the bit through $\mathrm{BSC}_{p_i}$.
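To make the mixture-of-BSCs picture concrete, here is a minimal Python sketch of such a channel. It is not from the original notes: the crossover probabilities `p`, the weights `q`, and the function name `transmit` are illustrative choices. The channel reports the chosen $p_i$ alongside the output bit, matching the "nice to the detector" convention described next.

```python
import random

# Hypothetical mixture describing a symmetric two-input channel:
# with probability q[i], the bit passes through a BSC with crossover p[i].
p = [0.05, 0.20, 0.45]   # crossover probabilities p_1, ..., p_k (illustrative)
q = [0.50, 0.30, 0.20]   # mixture weights q_1, ..., q_k, summing to 1

def transmit(bit: int) -> tuple[int, float]:
    """Send one bit through the mixture channel.

    Returns the received bit together with the chosen crossover
    probability p_i, i.e., the reliability information a "nice"
    channel reveals to the detector.
    """
    i = random.choices(range(len(p)), weights=q)[0]  # pick BSC_{p_i} with prob q_i
    flipped = random.random() < p[i]                 # flip the bit with prob p_i
    return (bit ^ flipped, p[i])

if __name__ == "__main__":
    sent = 1
    received, p_i = transmit(sent)
    print(f"sent {sent}, received {received}, crossover p_i = {p_i}")
```

Sampling $i$ first and then flipping with probability $p_i$ is exactly the two-stage description in the paragraph above.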
If we are being nice to the detector (and we will be), then the channel will output a 0 or a 1 along with the chosen value of $p_i$, which provides a reliability measure for the output. For such an output $b$, the constant $c_b$ is exactly the probability $q_i$ of the chosen crossover probability.

4.4 Decoding Codes

We now turn to the problem of decoding general error-correcting codes. We will see that there are two different ideal ways of doing this.

Let $C \subseteq \{0, 1\}^n$ be a code. Assume that we choose a random codeword $(a_1, \ldots, a_n) \in C$, send it over a symmetric channel, and receive a vector $(b_1, \ldots, b_n)$. Moreover, assume that we know

$$p_i = \Pr[a_i = 1 \mid \text{rec } b_i].$$
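The preview cuts off here, but the quantities $p_i$ in the last line can already be computed for the "nice" channel above. The following Python sketch (a continuation of the earlier hypothetical `transmit` example, not part of the original notes) derives the per-bit posteriors from received (bit, $p_i$) pairs:

```python
def bitwise_posteriors(received: list[tuple[int, float]]) -> list[float]:
    """Per-position posteriors p_i = Pr[a_i = 1 | rec b_i], assuming each
    bit is a priori uniform and the channel reveals (bit, crossover).

    For a received pair (b, p): by Bayes' rule (the c_b normalizer cancels),
    the posterior that a 1 was sent is 1 - p if b = 1, and p if b = 0.
    """
    return [(1.0 - p) if b == 1 else p for (b, p) in received]

# Example: a length-4 received vector of (bit, crossover) pairs.
received = [(1, 0.05), (0, 0.20), (1, 0.45), (0, 0.05)]
print(bitwise_posteriors(received))  # approximately [0.95, 0.2, 0.55, 0.05]
```

These values summarize what the channel says about each bit in isolation; a decoder for $C$ must still combine them with the constraint that $(a_1, \ldots, a_n)$ is a codeword.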