Announcements

= Quiz # 7 will be held on Friday 19th December in class; Makeup Quiz # 1 will take place the same day at 5:00 pm
= Quiz # 8 will be held on Friday 26th December in class; Makeup Quiz # 2 will take place the same day at 5:00 pm

Agenda

= Structured Sequences
= Linear Block Codes
= Examples
= Error Detecting and Correcting Capabilities
= Hamming Code Example
= Convolutional Codes
= Quiz
Automatic Repeat Request

= ARQ vs. FEC
  - ARQ is much simpler than FEC and needs no redundancy
  - ARQ is sometimes not possible if:
    - A reverse channel is not available, or the delay with ARQ would be excessive
    - The retransmission strategy is not conveniently implemented
    - The expected number of errors, without correction, would require excessive retransmissions
[Figure 6.7: Automatic Repeat Request (ARQ). (a) Stop-and-wait ARQ (b) Continuous ARQ with pullback (c) Continuous ARQ with selective repeat. Transmitter, transmission, and receiver timelines show retransmissions triggered by detected errors.]
6.3 Structured Sequences

= Block codes
= Convolutional codes
= Turbo codes

6.3.2 Code Rate and Redundancy

= In the case of block codes, the encoder transforms each k-bit data block into a larger n-bit block, called code bits or channel symbols
= The (n-k) bits added to each data block are called redundant bits, parity bits, or check bits
= The ratio of redundant bits to data bits, (n-k)/k, is called the redundancy of the code
= The ratio of data bits to total bits, k/n, is called the code rate (a quick calculation follows the diagram below)
[Diagram: k-bit data block → channel encoder → n-bit codeword]
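A minimal sketch (not from the slides; function names are my own) that evaluates the two ratios above for the (6,3) code used later in this lecture:

```python
def redundancy(n: int, k: int) -> float:
    """Redundancy of an (n, k) block code: redundant bits per data bit."""
    return (n - k) / k

def code_rate(n: int, k: int) -> float:
    """Code rate of an (n, k) block code: data bits per transmitted bit."""
    return k / n

# (6,3) code used in the linear block code example below
print(redundancy(6, 3))  # 1.0 -> one redundant bit per data bit
print(code_rate(6, 3))   # 0.5 -> half of the transmitted bits carry data
```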
6.4 Linear Block Codes

6.4.1 Vector Spaces

= The set {0,1}, under modulo-2 addition and multiplication, forms a field:

  Addition        Multiplication
  0 ⊕ 0 = 0       0 · 0 = 0
  0 ⊕ 1 = 1       0 · 1 = 0
  1 ⊕ 0 = 1       1 · 0 = 0
  1 ⊕ 1 = 0       1 · 1 = 1
Some definitions — cont'd

= Examples of vector spaces
  - The set of binary n-tuples; for n = 4, denoted by
    V4 = {(0000), (0001), (0010), (0011), (0100), (0101), (0110), (0111),
          (1000), (1001), (1010), (1011), (1100), (1101), (1110), (1111)}
= Vector subspace:
  - A subset S of the vector space Vn is called a subspace if:
    - The all-zero vector is in S
    - The sum of any two vectors in S is also in S (closure property)
  - Example: {(0000), (0101), (1010), (1111)} is a subspace of V4
  - These are fundamental properties of linear block codes
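A small illustrative check (not part of the slides; the helper name is made up) that the subset above satisfies both subspace properties:

```python
from itertools import product

# Candidate subset of V4 from the slide, written as bit-tuples
S = {(0, 0, 0, 0), (0, 1, 0, 1), (1, 0, 1, 0), (1, 1, 1, 1)}

def xor(u, v):
    """Component-wise modulo-2 sum of two binary n-tuples."""
    return tuple(a ^ b for a, b in zip(u, v))

contains_zero = (0, 0, 0, 0) in S
closed = all(xor(u, v) in S for u, v in product(S, repeat=2))
print(contains_zero and closed)  # True -> S is a subspace of V4
```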
= The subset chosen as the code should include as many elements as possible to reduce redundancy, but the codewords should be as far apart as possible to maintain good error performance

[Diagram: the 2^n n-tuples constitute the entire space Vn; the 2^k codewords constitute a subspace of Vn — linear block-code structure]
Example:

= Let the generator matrix be:

      | V1 |   | 1 1 0 1 0 0 |
  G = | V2 | = | 0 1 1 0 1 0 |
      | V3 |   | 1 0 1 0 0 1 |

= Where V1, V2 and V3 are linearly independent vectors that can generate all the code vectors
= The sum of any two generating vectors does not yield any of the other generating vectors
= Generate codeword U4 for the fourth message vector 1 1 0 in Table 6.1:

  U4 = [1 1 0] G = 1·V1 + 1·V2 + 0·V3
     = 110100 + 011010 + 000000
     = 101110   (codeword for the message vector 1 1 0)
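A brief sketch (my own illustration, not from the slides) of encoding by modulo-2 matrix multiplication with the generator matrix G above; it reproduces U4 = 101110 for the message 1 1 0:

```python
import numpy as np

# Generator matrix of the (6,3) code from the example above
G = np.array([[1, 1, 0, 1, 0, 0],
              [0, 1, 1, 0, 1, 0],
              [1, 0, 1, 0, 0, 1]])

def encode(m, G):
    """Codeword U = m G, with all arithmetic taken modulo 2."""
    return np.mod(np.array(m) @ G, 2)

print(encode([1, 1, 0], G))  # [1 0 1 1 1 0] -> U4 = 101110
```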
6.4.5 Systematic Linear Block Codes

= A systematic (n,k) linear block code maps each k-dimensional message vector to an n-dimensional codeword such that part of the codeword consists of the k message digits and the remaining (n-k) digits are parity digits
= A systematic linear code has a generator matrix of the form

                  | p11  p12  ...  p1,(n-k)   1 0 ... 0 |
  G = [P | I_k] = | p21  p22  ...  p2,(n-k)   0 1 ... 0 |
                  | ...                                 |
                  | pk1  pk2  ...  pk,(n-k)   0 0 ... 1 |    (k × n)

  where I_k is the k × k identity matrix and P is a k × (n-k) matrix

= Combining:

  U = [u1, u2, ..., un] = [m1, m2, ..., mk] ×
                  | p11  p12  ...  p1,(n-k)   1 0 ... 0 |
                  | p21  p22  ...  p2,(n-k)   0 1 ... 0 |
                  | pk1  pk2  ...  pk,(n-k)   0 0 ... 1 |
Where

  u_i = m1 p1i + m2 p2i + ... + mk pki      for i = 1, ..., (n-k)
      = m_{i-(n-k)}                          for i = (n-k)+1, ..., n

= The systematic code vector can be expressed as:

  U = p1, p2, ..., p_{n-k}, m1, m2, ..., mk
      (parity bits)         (message bits)

  p1      = m1 p11      + m2 p21      + ... + mk pk1
  p2      = m1 p12      + m2 p22      + ... + mk pk2
  ...
  p_{n-k} = m1 p1,(n-k) + m2 p2,(n-k) + ... + mk pk,(n-k)
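A short sketch (illustrative only) of building a systematic generator matrix G = [P | I_k] and checking that the codeword ends with the message bits; the P matrix here is the one implied by the (6,3) example above:

```python
import numpy as np

k, n = 3, 6
P = np.array([[1, 1, 0],
              [0, 1, 1],
              [1, 0, 1]])                    # k x (n-k) parity sub-matrix of the (6,3) example
G = np.hstack([P, np.eye(k, dtype=int)])     # G = [P | I_k]

m = np.array([1, 1, 0])
U = np.mod(m @ G, 2)
print(U)                                     # [1 0 1 1 1 0]
print((U[n - k:] == m).all())                # True -> last k digits are the message itself
```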
6.4.7 Syndrome Testing

= It is easy to verify that

  U H^T = p1 + p1, p2 + p2, ..., p_{n-k} + p_{n-k} = 0

  where U is a codeword generated by matrix G if and only if U H^T = 0
= Let r be the received vector (one of 2^n n-tuples) when the codeword U (one of 2^k n-tuples) was transmitted:

  r = U + e

= The syndrome of r is defined as:

  S = r H^T = (U + e) H^T = U H^T + e H^T
  S = e H^T
[Diagram: data source → format → channel encoder (m → U) → modulation → channel → demodulation/detection (→ r) → channel decoder → format → data sink]
= Requirements of the parity-check matrix (see the sketch after this list)
  - No column of H can be all zeros, or else an error in the corresponding codeword position would not affect the syndrome and would be undetectable
  - All columns of H must be unique; if two columns of H were identical, errors in those two codeword positions would be indistinguishable
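A minimal sketch (my own, not from the slides) that tests both requirements on the parity-check matrix of the (6,3) example; H = [I3 | P^T] is the systematic form implied by the example that follows:

```python
import numpy as np

# Parity-check matrix of the (6,3) example: H = [I3 | P^T]
H = np.array([[1, 0, 0, 1, 0, 1],
              [0, 1, 0, 1, 1, 0],
              [0, 0, 1, 0, 1, 1]])

cols = [tuple(c) for c in H.T]
no_zero_column = all(any(c) for c in cols)     # every column has at least one 1
all_unique     = len(set(cols)) == len(cols)   # no two columns are identical
print(no_zero_column, all_unique)              # True True -> single errors detectable and distinguishable
```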
Example

= Codeword U = 1 0 1 1 1 0 was sent and r = 0 0 1 1 1 0 is received. Find S = r H^T:

                            | 1 0 0 |
                            | 0 1 0 |
  S = r H^T = [0 0 1 1 1 0] | 0 0 1 |
                            | 1 1 0 |
                            | 0 1 1 |
                            | 1 0 1 |

    = [1, 1+1, 1+1] = [1 0 0]   (syndrome of the corrupted code vector)
= The syndrome of the error pattern alone is the same:

  S = e H^T
    = [1 0 0 0 0 0] H^T
    = [1 0 0]
6.4.8 Error Correction

= Arranging the 2^n n-tuples, representing the possible received vectors, in an array is called the standard array. The standard array for an (n,k) code is:

  U1 (all-zero codeword)   U2                 U3                 ...   U_{2^k}
  e2                       U2 + e2            U3 + e2            ...   U_{2^k} + e2
  e3                       U2 + e3            U3 + e3            ...   U_{2^k} + e3
  ...
  e_j                      U2 + e_j           U3 + e_j           ...   U_{2^k} + e_j
  ...
  e_{2^(n-k)}              U2 + e_{2^(n-k)}   U3 + e_{2^(n-k)}   ...   U_{2^k} + e_{2^(n-k)}

  (first column: coset leaders)

= Each row, called a coset, consists of an error pattern in the first column, called the coset leader
= If the error pattern is not a coset leader, erroneous decoding will result
Example

= U = (101110) is transmitted and r = (001110) is received
= The syndrome of r is computed:

  S = r H^T = (001110) H^T = (100)

= The error pattern corresponding to this syndrome (from the syndrome lookup table) is

  ê = (100000)

= The corrected vector is estimated as

  Û = r + ê = (001110) + (100000) = (101110)

  Syndrome lookup table (coset leaders and their syndromes):

  Error pattern   Syndrome
  000000          000
  000001          101
  000010          011
  000100          110
  001000          001
  010000          010
  100000          100
  010001          111   (a weight-2 pattern chosen as the eighth coset leader)
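To tie the pieces together, here is a compact sketch (my own illustration, not the slides' own listing) of table-lookup syndrome decoding for the (6,3) code; it rebuilds the lookup table from H and reproduces the corrected vector above:

```python
import numpy as np

# Parity-check matrix of the (6,3) code: H = [I3 | P^T]
H = np.array([[1, 0, 0, 1, 0, 1],
              [0, 1, 0, 1, 1, 0],
              [0, 0, 1, 0, 1, 1]])

def syndrome(v):
    return tuple(np.mod(v @ H.T, 2))

# Coset leaders: the all-zero pattern, the six single-error patterns,
# and one weight-2 pattern (assumed, as above) covering the remaining syndrome.
leaders = [np.zeros(6, dtype=int)] + [np.eye(6, dtype=int)[i] for i in range(6)]
leaders.append(np.array([0, 1, 0, 0, 0, 1]))
table = {syndrome(e): e for e in leaders}    # syndrome lookup table

r = np.array([0, 0, 1, 1, 1, 0])             # received vector
e_hat = table[syndrome(r)]                   # estimated error pattern
U_hat = np.mod(r + e_hat, 2)                 # corrected vector
print(e_hat, U_hat)                          # [1 0 0 0 0 0] [1 0 1 1 1 0]
```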
Decoder implementation

= The received vector is multiplied by the parity-check matrix:

  S = r H^T

                            | 1 0 0 |
                            | 0 1 0 |
    = [r1 r2 r3 r4 r5 r6]   | 0 0 1 |
                            | 1 1 0 |
                            | 0 1 1 |
                            | 1 0 1 |

  and

  s1 = r1 + r4 + r6
  s2 = r2 + r4 + r5
  s3 = r3 + r5 + r6
[Figure: Implementation of the (6,3) decoder — the received vector r1...r6 drives exclusive-OR gates that form the syndrome (s1, s2, s3); AND gates map the syndrome to the error pattern e1...e6, which is added (XOR) to r to produce the corrected output Û]
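A sketch (illustrative, mirroring the circuit description above rather than reproducing the figure) of the same decoder written as bitwise XOR/AND logic; the syndrome-to-error mapping is the single-error table used earlier:

```python
def decode_63(r):
    """Hard-decision decoder for the (6,3) code, written gate-style.

    r is a list of six bits r1..r6 (index 0 = r1).
    """
    r1, r2, r3, r4, r5, r6 = r
    # Exclusive-OR gates forming the syndrome
    s1 = r1 ^ r4 ^ r6
    s2 = r2 ^ r4 ^ r5
    s3 = r3 ^ r5 ^ r6
    # AND-gate logic: one output per correctable single-error pattern
    e = [
        s1 & (1 - s2) & (1 - s3),   # error in position 1 (syndrome 100)
        (1 - s1) & s2 & (1 - s3),   # error in position 2 (syndrome 010)
        (1 - s1) & (1 - s2) & s3,   # error in position 3 (syndrome 001)
        s1 & s2 & (1 - s3),         # error in position 4 (syndrome 110)
        (1 - s1) & s2 & s3,         # error in position 5 (syndrome 011)
        s1 & (1 - s2) & s3,         # error in position 6 (syndrome 101)
    ]
    # Output stage: corrected bit = received bit XOR estimated error bit
    return [ri ^ ei for ri, ei in zip(r, e)]

print(decode_63([0, 0, 1, 1, 1, 0]))   # [1, 0, 1, 1, 1, 0]
```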
6.8 Well-Known Block Codes

6.8.1 Hamming Codes

= A simple class of block codes characterized by the structure:

  (n, k) = (2^m − 1, 2^m − 1 − m)

  where m = 2, 3, ... These codes have a minimum distance of 3 and are capable of correcting all single errors
Example: Hamming Codes

= Parameters of (n,k) Hamming codes:
  - Block length: n = 2^m − 1
  - Number of message bits: k = 2^m − m − 1
  - Number of parity bits: n − k = m
= Consider the Hamming code with n = 7 and k = 4, the (7,4) code corresponding to m = 3
= Generator matrix (systematic form, G = [P | I4], consistent with the codeword table and parity-check matrix below):

      | 1 1 0  1 0 0 0 |
  G = | 0 1 1  0 1 0 0 |
      | 1 1 1  0 0 1 0 |
      | 1 0 1  0 0 0 1 |
  Message | Codeword | Weight       Message | Codeword | Weight
  0000    | 0000000  | 0            1000    | 1101000  | 3
  0001    | 1010001  | 3            1001    | 0111001  | 4
  0010    | 1110010  | 4            1010    | 0011010  | 3
  0011    | 0100011  | 3            1011    | 1001011  | 4
  0100    | 0110100  | 3            1100    | 1011100  | 4
  0101    | 1100101  | 4            1101    | 0001101  | 3
  0110    | 1000110  | 3            1110    | 0101110  | 4
  0111    | 0010111  | 4            1111    | 1111111  | 7

= Corresponding parity-check matrix H = [I_{n-k} | P^T]:

      | 1 0 0  1 0 1 1 |
  H = | 0 1 0  1 1 1 0 |
      | 0 0 1  0 1 1 1 |
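A short sketch (my own, not from the slides) that regenerates the (7,4) codeword table from the generator matrix above and confirms the minimum distance of 3 (for a linear code, the smallest nonzero codeword weight):

```python
import numpy as np
from itertools import product

# Systematic generator matrix G = [P | I4] of the (7,4) Hamming code above
P = np.array([[1, 1, 0],
              [0, 1, 1],
              [1, 1, 1],
              [1, 0, 1]])
G = np.hstack([P, np.eye(4, dtype=int)])

codewords = [np.mod(np.array(m) @ G, 2) for m in product([0, 1], repeat=4)]
weights = [int(c.sum()) for c in codewords]
d_min = min(w for w in weights if w > 0)
print(len(codewords), d_min)   # 16 3 -> 2^k codewords, minimum distance 3 (corrects single errors)
```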
7.1 CONVOLUTIONAL ENCODING

= A convolutional code is described by three integers, n, k, and K, where the ratio k/n is the rate of the code
= The integer K is the constraint length; it represents the number of k-tuple stages in the encoding shift register
= The encoder has memory: the n-tuple emitted by the convolutional encoding procedure is not only a function of the current input k-tuple, but also a function of the previous K−1 input k-tuples
Block Diagram of a Typical Communication Link

[Figure 7.1: Encode/decode and modulate/demodulate portions of a communication link. The information source emits m = m1, m2, ..., mi, ...; the convolutional encoder produces the codeword sequence U = G(m) = U1, U2, ..., Ui, ..., where Ui = u1i, ..., uji, ..., uni; after modulation, the AWGN channel, and demodulation, the decoder receives Z = Z1, Z2, ..., Zi, ..., where zji is the jth demodulator output symbol of branch word Zi, and delivers m̂ to the information sink.]
7.2 Convolutional Encoder Representation

[Figure 7.2: Convolutional encoder with constraint length K and rate k/n. The input sequence m = m1, m2, ..., mi, ... is shifted in k bits at a time into a kK-stage shift register; n modulo-2 adders read the register and produce the codeword sequence U = U1, U2, ..., Ui, ..., where Ui = u1i, ..., uji, ..., uni is the ith codeword branch and uji is the jth binary code symbol of branch word Ui.]
7.2.1.2 Polynomial Representation

= A convolutional encoder may be represented by a set of n generator polynomials, one for each of the n modulo-2 adders
= Continuing with the same example, we can write the generator polynomial g1(X) for the upper connections and g2(X) for the lower connections:

  g1(X) = 1 + X + X^2
  g2(X) = 1 + X^2

= U(X) is the output sequence:

  U(X) = m(X) g1(X) interlaced with m(X) g2(X)

= For the message m = 1 0 1, the encoder output can be found as follows (see the sketch after these equations):
  m(X) g1(X) = (1 + X^2)(1 + X + X^2) = 1 + X + X^3 + X^4
  m(X) g2(X) = (1 + X^2)(1 + X^2)     = 1 + X^4

  m(X) g1(X) = 1 + X + 0X^2 + X^3 + X^4
  m(X) g2(X) = 1 + 0X + 0X^2 + 0X^3 + X^4

  U(X) = (1,1) + (1,0)X + (0,0)X^2 + (1,0)X^3 + (1,1)X^4
  U    = 11 10 00 10 11
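A brief sketch (illustrative, not the text's own listing) of the polynomial representation: multiply m(X) by each generator polynomial over GF(2), then interlace the two output streams; it reproduces U = 11 10 00 10 11 for m = 101:

```python
def poly_mul_gf2(a, b):
    """Multiply two GF(2) polynomials given as coefficient lists (lowest degree first)."""
    out = [0] * (len(a) + len(b) - 1)
    for i, ai in enumerate(a):
        for j, bj in enumerate(b):
            out[i + j] ^= ai & bj
    return out

m  = [1, 0, 1]        # m(X)  = 1 + X^2
g1 = [1, 1, 1]        # g1(X) = 1 + X + X^2  (upper adder)
g2 = [1, 0, 1]        # g2(X) = 1 + X^2      (lower adder)

u1 = poly_mul_gf2(m, g1)   # [1, 1, 0, 1, 1]
u2 = poly_mul_gf2(m, g2)   # [1, 0, 0, 0, 1]

# Interlace the two streams to form the branch words of U
U = [bit for pair in zip(u1, u2) for bit in pair]
print(U)   # [1, 1, 1, 0, 0, 0, 1, 0, 1, 1] -> 11 10 00 10 11
```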
Encoder representation

= Impulse response representation:
  - The response of the encoder to a single "one" bit that shifts through it
= Example (for the same rate-1/2, K = 3 encoder):

  Register contents   Branch word (u1 u2)
  100                 1 1
  010                 1 0
  001                 1 1

  Input sequence:  1 0 0
  Output sequence: 11 10 11

= The output for the input m = 1 0 1 is obtained by superposition (modulo-2 addition of the time-shifted impulse responses):

  Input m   Output
  1         11 10 11
  0            00 00 00
  1               11 10 11
  Modulo-2 sum: 11 10 00 10 11
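Finally, a sketch (my own illustration, under the stated encoder assumptions: rate 1/2, K = 3, generators 111 and 101) of the shift-register view of the same encoder; it reproduces the impulse response 11 10 11 and the output 11 10 00 10 11 for m = 101:

```python
def conv_encode(msg, K=3, taps=((1, 1, 1), (1, 0, 1))):
    """Rate-1/n convolutional encoder: one shift register, one output per tap set."""
    reg = [0] * K                       # shift-register contents, newest bit first
    out = []
    for bit in msg:
        reg = [bit] + reg[:-1]          # shift the new message bit in
        for g in taps:                  # one modulo-2 adder per generator
            out.append(sum(b & t for b, t in zip(reg, g)) % 2)
    return out

print(conv_encode([1, 0, 0]))           # [1, 1, 1, 0, 1, 1] -> impulse response 11 10 11
print(conv_encode([1, 0, 1] + [0, 0]))  # m = 101 plus K-1 flush zeros -> 11 10 00 10 11
```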