Compression & Huffman Codes: Techniques & Algorithms for Data Compression - Prof. Nelson P, Study notes of Computer Science

An overview of data compression, focusing on the concept of compression, its benefits, and the use of Huffman codes. It covers the principles of compression, the sources of compressibility, the types of compression, and the effectiveness of lossless and lossy compression. It also delves into the Huffman code approach, its algorithm, and its properties.

Typology: Study notes

Uploaded on 07/29/2009
CMSC 132: Object-Oriented Programming II
Compression & Huffman Codes
Department of Computer Science, University of Maryland, College Park

Overview
- Compression: examples, sources, types, effectiveness
- Huffman code: properties, Huffman tree (encoding), decoding

Sources of Compressibility
- Redundancy
  - Recognize repeating patterns
  - Exploit using a dictionary or variable-length encoding
- Human perception
  - Less sensitive to some information
  - Can discard less important data

Types of Compression
- Lossless
  - Preserves all information
  - Exploits redundancy in data
  - Applied to general data; some lossless audio formats exist (e.g., FLAC)
- Lossy
  - May lose some information
  - Exploits redundancy & human perception
  - Applied to audio, image, video, and multimedia

Effectiveness of Compression
Metrics:
- Bits per byte (8 bits): 2 bits/byte ⇒ ¼ of original size; 8 bits/byte ⇒ no compression
- Percentage: 75% compression ⇒ ¼ of original size

Lossless Compression Techniques
- LZW (Lempel-Ziv-Welch) compression: build a pattern dictionary; replace patterns with an index into the dictionary
- Run-length encoding: find & compress repetitive sequences
- Huffman code: use variable-length codes based on symbol frequency

Huffman Code Approach
- Variable-length encoding of symbols
- Exploits the statistical frequency of symbols
- Efficient when symbol probabilities vary widely
- Principle: use fewer bits to represent frequent symbols, more bits to represent infrequent symbols

Huffman Code Example

  Symbol             Dog   Cat   Bird  Fish
  Frequency          1/8   1/4   1/2   1/8
  Original encoding  00    01    10    11    (2 bits each)
  Huffman encoding   110   10    0     111   (3, 2, 1, 3 bits)

Expected size:
- Original ⇒ 1/8×2 + 1/4×2 + 1/2×2 + 1/8×2 = 2 bits/symbol
- Huffman ⇒ 1/8×3 + 1/4×2 + 1/2×1 + 1/8×3 = 1.75 bits/symbol

Huffman Code – Creating the Tree
Algorithm:
1. Place each symbol in a leaf; the weight of the leaf is the symbol's frequency
2. Select the two trees L and R (initially leaves) with the lowest frequencies
3. Create a new internal node with left child L and right child R; its frequency is frequency(L) + frequency(R)
4. Repeat until all nodes are merged into one tree

[Figures: step-by-step Huffman tree construction for the symbols A (3), C (5), E (8), H (2), I (7), producing the codes E = 01, I = 00, C = 10, A = 111, H = 110]

Huffman Coding Example
- Input: ACE
- Output: (111)(10)(01) = 1111001

Huffman Code Algorithm Overview – Decoding
- Read the compressed file & the binary tree
- Use the binary tree to decode the file, following the path from root to leaf for each symbol

[Figures: decoding 1111001 bit by bit against the same tree yields A, then C, then E]

Huffman Code Properties
- Prefix code
  - No code is a prefix of another code
  - Example: Huffman("dog") ⇒ 01 together with Huffman("cat") ⇒ 011 would not be a legal prefix code
  - Decoding can stop as soon as a complete code is found; no end-of-code marker is needed
- Nondeterministic
  - Multiple Huffman codings are possible for the same input when two or more trees tie for the minimal weight
- Greedy algorithm
  - Chooses the best local solution at each step: combines the 2 trees with the lowest frequencies
  - Still yields the overall best solution: an optimal prefix code based on statistical frequency
- Better compression may still be possible, depending on the data, using other approaches (e.g., a pattern dictionary)
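The greedy tree construction, encoding, and decoding described above can be sketched in code. The following is a minimal Python sketch, not part of the original notes (all function names are my own), using a min-heap for the repeated lowest-frequency merges. Run on single-letter stand-ins D, C, B, F with the weights from the Dog/Cat/Bird/Fish example, it yields code lengths of 3, 2, 1, and 3 bits and the 1.75 bits/symbol average computed on the slide.

```python
import heapq
from itertools import count

def huffman_codes(freqs):
    """Build a Huffman code table {symbol: bitstring} from {symbol: weight}.

    Heap entries are (frequency, tiebreaker, tree); the tiebreaker keeps
    tuple comparison away from the tree structures when frequencies tie.
    A leaf is a 1-tuple (symbol,); an internal node is a pair (left, right).
    """
    tick = count()
    heap = [(f, next(tick), (sym,)) for sym, f in freqs.items()]
    heapq.heapify(heap)
    while len(heap) > 1:
        f1, _, left = heapq.heappop(heap)    # lowest-frequency tree
        f2, _, right = heapq.heappop(heap)   # second-lowest tree
        # Merge into a new internal node whose weight is the sum.
        heapq.heappush(heap, (f1 + f2, next(tick), (left, right)))
    codes = {}
    def walk(node, prefix):
        if len(node) == 1:                   # leaf: record its code
            codes[node[0]] = prefix or "0"   # lone-symbol edge case
        else:                                # internal: 0 = left, 1 = right
            walk(node[0], prefix + "0")
            walk(node[1], prefix + "1")
    walk(heap[0][2], "")
    return codes

def encode(text, codes):
    return "".join(codes[ch] for ch in text)

def decode(bits, codes):
    inv = {v: k for k, v in codes.items()}
    out, cur = [], ""
    for b in bits:
        cur += b
        if cur in inv:    # prefix property: first complete match is a symbol
            out.append(inv[cur])
            cur = ""
    return "".join(out)

# Dog/Cat/Bird/Fish frequencies (1/8, 1/4, 1/2, 1/8) scaled by 8.
freqs = {"D": 1, "C": 2, "B": 4, "F": 1}
codes = huffman_codes(freqs)
assert decode(encode("BCDF", codes), codes) == "BCDF"   # round trip
```

The exact bitstrings depend on how ties between equal-weight trees are broken (the nondeterminism noted in the properties slide), so the checks look at code lengths and the round trip rather than specific codes.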