Lecture Slides on Compression and Huffman Codes | CMSC 132, Study notes of Computer Science

Material Type: Notes; Class: OBJECT-ORIENTED PROG II; Subject: Computer Science; University: University of Maryland; Term: Spring 2005;

Partial preview of the text

Compression & Huffman Codes
Fawzi Emad, Chau-Wen Tseng
Department of Computer Science, University of Maryland, College Park

Compression
- Definition: reduce the size of data (the number of bits needed to represent it)
- Benefits: reduce storage needed; reduce transmission cost / latency / bandwidth

Compression Examples
- Tools: winzip, pkzip, compress, gzip
- Formats
  - Images: .jpg, .gif
  - Audio: .mp3, .wav
  - Video: mpeg1 (VCD), mpeg2 (DVD), mpeg4 (Divx)
  - General: .zip, .gz

Sources of Compressibility
- Redundancy
  - Recognize repeating patterns
  - Exploit using a dictionary or variable length encoding
- Human perception
  - Less sensitive to some information
  - Can discard less important data

Lossless Compression Techniques
- LZW (Lempel-Ziv-Welch) compression: build a pattern dictionary, replace patterns with an index into the dictionary
- Burrows-Wheeler transform: block-sort data to improve compression
- Run length encoding: find & compress repetitive sequences
- Huffman code: use variable length codes based on symbol frequency

Huffman Code
- Approach
  - Variable length encoding of symbols
  - Exploit statistical frequency of symbols
  - Efficient when symbol probabilities vary widely
- Principle
  - Use fewer bits to represent frequent symbols
  - Use more bits to represent infrequent symbols
- [Figure: example symbol stream A A B A A A A B, with frequent A and infrequent B]

Huffman Code Example

  Symbol              Dog           Cat           Bird         Fish
  Frequency           1/8           1/4           1/2          1/8
  Original encoding   00 (2 bits)   01 (2 bits)   10 (2 bits)  11 (2 bits)
  Huffman encoding    110 (3 bits)  10 (2 bits)   0 (1 bit)    111 (3 bits)

- Expected size
  - Original ⇒ 1/8×2 + 1/4×2 + 1/2×2 + 1/8×2 = 2 bits / symbol
  - Huffman ⇒ 1/8×3 + 1/4×2 + 1/2×1 + 1/8×3 = 1.75 bits / symbol

Huffman Code Data Structures
- Binary (Huffman) tree
  - Represents the Huffman code
  - Edge ⇒ code bit (0 or 1)
  - Leaf ⇒ symbol
  - Path from root to leaf ⇒ encoding
  - Example: A = "11", H = "10", C = "0"
- Priority queue
  - Used to efficiently build the binary tree

Huffman Code Algorithm Overview: Encoding
1. Calculate the frequency of symbols in the file
2. Create a binary tree representing the "best" encoding
3. Use the binary tree to encode the compressed file
   - For each symbol, output the path from root to leaf
   - Size of encoding = length of path
4. Save the binary tree

Huffman Code: Creating the Tree
1. Place each symbol in a leaf; weight of leaf = symbol frequency
2. Select two trees L and R (initially leaves) such that L and R have the lowest frequencies in the tree
3. Create a new (internal) node: left child ⇒ L, right child ⇒ R, new frequency ⇒ frequency(L) + frequency(R)
4. Repeat until all nodes are merged into one tree

Huffman Tree Construction
[Figure: tree built from leaves A (3), C (5), E (8), H (2), I (7), producing internal nodes of weight 5, 10, 15, 25]
Resulting codes: E = 01, I = 00, C = 10, A = 111, H = 110

Huffman Coding Example
- Huffman code: E = 01, I = 00, C = 10, A = 111, H = 110
- Input: ACE
- Output: (111)(10)(01) = 1111001

Huffman Code Algorithm Overview: Decoding
1. Read the compressed file & the binary tree
2. Use the binary tree to decode the file, following the path from root to leaf for each symbol

Huffman Decoding 1-3
[Figures: the tree above, stepping through the bit string 1111001; after the first three bits 111 reach leaf A, the symbol A is output and decoding restarts at the root]
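The tree-construction and encoding steps above map naturally onto a small program. Below is a minimal Java sketch (Java matching the course context), assuming the slides' left-edge = 0 / right-edge = 1 convention; the class and method names (HuffmanSketch, HuffmanNode, buildTree, buildCodeTable) are illustrative and not part of the original lecture material. It uses a priority queue exactly as the slides suggest: repeatedly remove the two lowest-frequency trees and merge them under a new internal node.

```java
import java.util.*;

public class HuffmanSketch {
    /** Tree node: leaves hold a symbol, internal nodes hold only a combined frequency. */
    static class HuffmanNode implements Comparable<HuffmanNode> {
        final Character symbol;      // null for internal nodes
        final int frequency;
        final HuffmanNode left, right;

        HuffmanNode(char symbol, int frequency) {
            this(symbol, frequency, null, null);
        }
        HuffmanNode(Character symbol, int frequency, HuffmanNode left, HuffmanNode right) {
            this.symbol = symbol;
            this.frequency = frequency;
            this.left = left;
            this.right = right;
        }
        boolean isLeaf() { return left == null && right == null; }
        public int compareTo(HuffmanNode other) {
            return Integer.compare(frequency, other.frequency);
        }
    }

    /** Place each symbol in a leaf, then repeatedly merge the two lowest-frequency trees. */
    static HuffmanNode buildTree(Map<Character, Integer> frequencies) {
        PriorityQueue<HuffmanNode> queue = new PriorityQueue<>();
        for (Map.Entry<Character, Integer> e : frequencies.entrySet()) {
            queue.add(new HuffmanNode(e.getKey(), e.getValue()));
        }
        while (queue.size() > 1) {
            HuffmanNode l = queue.poll();   // lowest frequency
            HuffmanNode r = queue.poll();   // next lowest frequency
            queue.add(new HuffmanNode(null, l.frequency + r.frequency, l, r));
        }
        return queue.poll();                // the single remaining tree is the Huffman tree
    }

    /** The path from root to leaf gives each symbol's code (left = 0, right = 1). */
    static void buildCodeTable(HuffmanNode node, String path, Map<Character, String> codes) {
        if (node.isLeaf()) {
            codes.put(node.symbol, path);
        } else {
            buildCodeTable(node.left, path + "0", codes);
            buildCodeTable(node.right, path + "1", codes);
        }
    }

    public static void main(String[] args) {
        // Frequencies from the slides' tree-construction example.
        Map<Character, Integer> freq = Map.of('A', 3, 'C', 5, 'E', 8, 'H', 2, 'I', 7);
        HuffmanNode root = buildTree(freq);
        Map<Character, String> codes = new HashMap<>();
        buildCodeTable(root, "", codes);

        // Encode "ACE" by concatenating the codes of its symbols.
        StringBuilder encoded = new StringBuilder();
        for (char c : "ACE".toCharArray()) {
            encoded.append(codes.get(c));
        }
        // e.g. {A=111, C=10, E=01, H=110, I=00}; exact 0/1 labels can differ when
        // equal frequencies tie, but the code lengths (and the 7-bit total) match the slides.
        System.out.println(codes);
        System.out.println(encoded);
    }
}
```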
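Decoding is the mirror image: start at the root, follow the left child on a 0 bit and the right child on a 1 bit, and emit a symbol whenever a leaf is reached, which is what the three decoding slides trace for 1111001. A sketch of that loop, written as a method to add to the hypothetical HuffmanSketch class above:

```java
/**
 * Decode a bit string by walking the Huffman tree from the root.
 * Each time a leaf is reached its symbol is emitted and the walk
 * restarts at the root (mirrors the slides' step-by-step decoding).
 */
static String decode(HuffmanNode root, String bits) {
    StringBuilder out = new StringBuilder();
    HuffmanNode node = root;
    for (char bit : bits.toCharArray()) {
        node = (bit == '0') ? node.left : node.right;  // 0 = left edge, 1 = right edge
        if (node.isLeaf()) {
            out.append(node.symbol);  // full path from root to this leaf = one symbol
            node = root;              // start over for the next symbol
        }
    }
    return out.toString();
}

// With the slides' code assignment, decode(root, "1111001") walks
// 1-1-1 to A, 1-0 to C, and 0-1 to E, returning "ACE".
```

Because decode walks whatever tree buildTree produced, it inverts that tree's own code assignment, so encode-then-decode round-trips correctly even if the 0/1 labels differ from the slides'.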