Understanding Compression and Huffman Codes: A Detailed Guide (Computer Science Study Notes)

An in-depth exploration of compression techniques, focusing on Huffman codes: the benefits of compression, sources of compressibility, types of compression, the effectiveness of lossless and lossy compression, and the principles of Huffman coding and how it compares with other methods.

Partial preview of the text (some slides are missing from the extract).

Slide 1: CMSC 132: Object-Oriented Programming II
Compression & Huffman Codes
Department of Computer Science, University of Maryland, College Park

Slide 2: Overview
- Compression: examples, sources, types, effectiveness
- Huffman code: properties, Huffman tree (encoding), decoding

Slide 3: Compression
- Definition: reduce the size of data (the number of bits needed to represent it)
- Benefits: less storage needed; lower transmission cost / latency / bandwidth

Slide 4: Compression Examples
- Tools: winzip, pkzip, compress, gzip
- Formats
  - Images: .jpg, .gif
  - Audio: .wav (CD), .mp3, .wma, .aac
  - Video: mpeg1 (LD, VCD), mpeg2 (DVD), mpeg4 (DivX)
  - General: .zip, .gz

Slide 9: Effectiveness of Compression
- Lossless compression is not guaranteed (pigeonhole principle)
  - Reducing size by 1 bit ⇒ only half of the possible inputs can be represented
  - Example: 000, 001, 010, 011, 100, 101, 110, 111 ⇒ 00, 01, 10, 11
- Alternative view: if compression were always possible
  - Compress the file (reduce its size by 1 bit)
  - Recompress the output
  - Repeat until the data is stored in 0 bits, which is impossible

Slide 10: Lossless Compression Techniques
- LZW (Lempel-Ziv-Welch) compression: build a pattern dictionary and replace patterns with an index into the dictionary
- Run-length encoding: find and compress repetitive sequences
- Huffman code: use variable-length codes based on symbol frequency

Slide 11: Huffman Code
- Approach
  - Variable-length encoding of symbols
  - Exploits the statistical frequency of symbols
  - Efficient when symbol probabilities vary widely
- Principle
  - Use fewer bits to represent frequent symbols
  - Use more bits to represent infrequent symbols
  - Example data: A A B A A A A B

Slide 12: Huffman Code Example

  Symbol              Dog       Cat       Bird      Fish
  Frequency           1/8       1/4       1/2       1/8
  Original encoding   00        01        10        11
                      (2 bits)  (2 bits)  (2 bits)  (2 bits)
  Huffman encoding    110       10        0         111
                      (3 bits)  (2 bits)  (1 bit)   (3 bits)

- Expected size
  - Original ⇒ 1/8×2 + 1/4×2 + 1/2×2 + 1/8×2 = 2 bits/symbol
  - Huffman ⇒ 1/8×3 + 1/4×2 + 1/2×1 + 1/8×3 = 1.75 bits/symbol

Slide 13: Huffman Code Data Structures
- Binary (Huffman) tree
  - Represents the Huffman code
  - Edge ⇒ code bit (0 or 1)
  - Leaf ⇒ symbol
  - Path from root to leaf ⇒ encoding
  - Example: A = "11", H = "10", C = "0"
- Priority queue
  - Used to build the binary tree efficiently

Slide 14: Huffman Code Algorithm Overview (Encoding)
- Calculate the frequency of each symbol in the file
- Create a binary tree representing the "best" encoding
- Use the binary tree to encode the compressed file
  - For each symbol, output the path from the root to its leaf
  - Size of the encoding = length of the path
- Save the binary tree (it is needed for decoding); a Java sketch of this construction follows below
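The priority-queue construction from slides 13 and 14 can be sketched in a few lines of Java (the course language). This is a minimal illustration, not the course's reference implementation; the names HuffmanSketch, Node, buildTree, and collectCodes are invented for this example.

import java.util.HashMap;
import java.util.Map;
import java.util.PriorityQueue;

// Minimal sketch of Huffman tree construction with a priority queue,
// following the algorithm outlined on slides 13-14. All names are illustrative.
public class HuffmanSketch {

    // A node is either a leaf (symbol != null) or an internal node with two children.
    static class Node {
        final Character symbol;   // null for internal nodes
        final int frequency;
        final Node left, right;

        Node(char symbol, int frequency) {          // leaf
            this.symbol = symbol;
            this.frequency = frequency;
            this.left = null;
            this.right = null;
        }

        Node(Node left, Node right) {               // internal node
            this.symbol = null;
            this.frequency = left.frequency + right.frequency;
            this.left = left;
            this.right = right;
        }

        boolean isLeaf() { return symbol != null; }
    }

    // Greedy step: repeatedly combine the two lowest-frequency trees
    // until a single tree remains.
    static Node buildTree(Map<Character, Integer> frequencies) {
        PriorityQueue<Node> queue =
                new PriorityQueue<>((a, b) -> Integer.compare(a.frequency, b.frequency));
        for (Map.Entry<Character, Integer> entry : frequencies.entrySet()) {
            queue.add(new Node(entry.getKey(), entry.getValue()));
        }
        while (queue.size() > 1) {
            queue.add(new Node(queue.poll(), queue.poll()));
        }
        return queue.poll();
    }

    // The path from root to leaf is the code: left edge appends '0', right edge '1'.
    static void collectCodes(Node node, String path, Map<Character, String> codes) {
        if (node.isLeaf()) {
            codes.put(node.symbol, path);
            return;
        }
        collectCodes(node.left, path + "0", codes);
        collectCodes(node.right, path + "1", codes);
    }

    public static void main(String[] args) {
        // Frequencies taken from the tree-construction slides: H=2, A=3, C=5, I=7, E=8.
        Map<Character, Integer> frequencies = new HashMap<>();
        frequencies.put('H', 2);
        frequencies.put('A', 3);
        frequencies.put('C', 5);
        frequencies.put('I', 7);
        frequencies.put('E', 8);

        Map<Character, String> codes = new HashMap<>();
        collectCodes(buildTree(frequencies), "", codes);
        System.out.println(codes);   // one optimal prefix code; tie-breaking may vary
    }
}

Because ties between equal-weight trees can be broken either way, the codes this prints may differ from the ones on the slides while still being optimal (this is the "nondeterministic" property noted on slide 30).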
Slide 19: Huffman Tree Construction (step 4)
[Figure: intermediate forest over the symbols H=2, A=3, C=5, I=7, E=8, with combined subtrees of weight 5, 10, and 15]

Slide 20: Huffman Tree Construction (step 5)
[Figure: completed Huffman tree with root weight 25; edges labeled 0 (left) and 1 (right)]
- Resulting codes: E = 01, I = 00, C = 10, A = 111, H = 110

Slide 21: Huffman Coding Example
- Codes: E = 01, I = 00, C = 10, A = 111, H = 110
- Input: ACE
- Output: (111)(10)(01) = 1111001

Slide 22: Huffman Code Algorithm Overview (Decoding)
- Read the compressed file & the binary tree
- Use the binary tree to decode the file
  - Follow the path from the root to a leaf for each code
- A decoding sketch in Java follows at the end of these notes

Slides 23, 24, 29: Huffman Decoding (steps 1, 2, and 7)
[Figures: the tree from slide 20 traversed while decoding the input 1111001; by step 7 the output ACE has been produced]

Slide 30: Huffman Code Properties
- Prefix code
  - No code is a prefix of another code
  - Example: Huffman("dog") ⇒ 01 together with Huffman("cat") ⇒ 011 is not a legal prefix code
  - Decoding can stop as soon as a complete code is found; no end-of-code marker is needed
- Nondeterministic
  - Multiple Huffman codings are possible for the same input
  - Occurs when more than two trees have the same minimal weight

Slide 31: Huffman Code Properties
- Greedy algorithm
  - Chooses the best local solution at each step: combines the 2 trees with the lowest frequency
  - Still yields the overall best solution
- Optimal prefix code
  - Optimal with respect to the statistical frequency of symbols
  - Better compression may still be possible (depending on the data) using other approaches (e.g., a pattern dictionary)
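As promised after slide 22, here is a matching decoding sketch: it walks the tree bit by bit, emits a symbol at each leaf, and restarts at the root, which is the traversal shown on slides 23 through 29. It reuses the illustrative Node class from the encoding sketch above (compile both in the same package) and hand-builds the tree from slides 19-20; the class layout and main method are assumptions for this example, not code from the notes.

// Minimal decoding sketch for the traversal on slides 22-29: follow edges
// bit by bit from the root, emit the symbol at each leaf, then restart at
// the root. Reuses the illustrative Node class from the encoding sketch.
public class HuffmanDecodeSketch {

    static String decode(HuffmanSketch.Node root, String bits) {
        StringBuilder output = new StringBuilder();
        HuffmanSketch.Node current = root;
        for (char bit : bits.toCharArray()) {
            current = (bit == '0') ? current.left : current.right;
            if (current.isLeaf()) {          // a complete code has been read
                output.append(current.symbol);
                current = root;              // restart at the root for the next symbol
            }
        }
        return output.toString();
    }

    public static void main(String[] args) {
        // Hand-built copy of the tree from slides 19-20,
        // giving the codes I=00, E=01, C=10, H=110, A=111.
        HuffmanSketch.Node tree = new HuffmanSketch.Node(
                new HuffmanSketch.Node(                      // "0" subtree
                        new HuffmanSketch.Node('I', 7),      // I = 00
                        new HuffmanSketch.Node('E', 8)),     // E = 01
                new HuffmanSketch.Node(                      // "1" subtree
                        new HuffmanSketch.Node('C', 5),      // C = 10
                        new HuffmanSketch.Node(
                                new HuffmanSketch.Node('H', 2),    // H = 110
                                new HuffmanSketch.Node('A', 3)))); // A = 111

        System.out.println(decode(tree, "1111001"));  // prints ACE, as on slide 29
    }
}

Because no code is a prefix of another (slide 30), reaching a leaf always identifies a complete symbol, so no end-of-code marker is needed.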