CMSC 132: Object-Oriented Programming II
Compression & Huffman Codes
Department of Computer Science
University of Maryland, College Park

Slide 2: Overview
- Compression: examples, sources, types, effectiveness
- Huffman code: properties, Huffman tree (encoding), decoding

Slide 3: Compression
- Definition: reduce the size of data (the number of bits needed to represent it)
- Benefits
  - Reduce storage needed
  - Reduce transmission cost / latency / bandwidth

Slide 4: Compression Examples
- General: .zip, .rar
- Images: .jpg, .gif
- Audio: .mp3, .wma
- Video: .mpg, .mov

Slide 5: Sources of Compressibility
- Redundancy
  - Recognize repeating patterns
  - Exploit using a dictionary or variable-length encoding
- Human perception
  - Less sensitive to some information
  - Can discard less important data

Slide 6: Types of Compression
- Lossless
  - Preserves all information
  - Exploits redundancy in data
  - Applied to general data; also some audio formats (e.g., FLAC)
- Lossy
  - May lose some information
  - Exploits redundancy & human perception
  - Applied to audio, image, video, multimedia

Slide 7: Effectiveness of Compression
- Metrics
  - Bits per byte (8 bits)
    - 2 bits/byte ⇒ ¼ of original size
    - 8 bits/byte ⇒ no compression
  - Percentage
    - 75% compression ⇒ ¼ of original size

Slide 8: Effectiveness of Compression
- Depends on the data
  - Random data ⇒ hard to compress
    Example: 1001110100 ⇒ ?
  - Organized data ⇒ easy to compress
    Example: 1111111111 ⇒ 1×10
- Corollary: there is no universally best compression algorithm

Slide 9: Effectiveness of Compression
- Lossless compression is not guaranteed
  - Pigeonhole principle: reducing size by 1 bit ⇒ can represent only ½ of the data
    Example: 000, 001, 010, 011, 100, 101, 110, 111 ⇒ 00, 01, 10, 11
  - Alternative view: if compression were always possible
    - Compress the file (reduce its size by 1 bit)
    - Recompress the output
    - Repeat (until we could store any data with 0 bits)

Slide 10: Lossless Compression Techniques
- LZW (Lempel-Ziv-Welch) compression
  - Build a pattern dictionary
  - Replace patterns with an index into the dictionary
- Run-length encoding
  - Find & compress repetitive sequences
- Huffman code
  - Use variable-length codes based on frequency

Slide 11: Huffman Code
- Approach
  - Variable-length encoding of symbols
  - Exploits the statistical frequency of symbols
  - Efficient when symbol probabilities vary widely
- Principle
  - Use fewer bits to represent frequent symbols
  - Use more bits to represent infrequent symbols

Slide 12: Huffman Code Example

  Symbol             Dog   Cat   Bird   Fish
  Frequency          1/8   1/4   1/2    1/8
  Original encoding  00    01    10     11     (2 bits each)
  Huffman encoding   110   10    0      111    (3, 2, 1, 3 bits)

- Expected size
  - Original ⇒ 1/8×2 + 1/4×2 + 1/2×2 + 1/8×2 = 2 bits/symbol
  - Huffman ⇒ 1/8×3 + 1/4×2 + 1/2×1 + 1/8×3 = 1.75 bits/symbol

Slides 25-29: Huffman Decoding (steps 3-7)
[Figure, repeated on each slide: a Huffman tree with leaves A, C, E, H, I
and edges labeled 0/1. The encoded string 1111001 is decoded one bit at a
time by walking the tree from the root; the output grows step by step from
"A" to "AC" to "ACE".]

Slide 30: Huffman Code Properties
- Prefix code
  - No code is a prefix of another code
    Example: Huffman("dog") ⇒ 01, Huffman("cat") ⇒ 011  // not a legal prefix code
  - Can stop as soon as a complete code is found
  - No need for an end-of-code marker
- Nondeterministic
  - Multiple Huffman codings are possible for the same input
  - Occurs if two or more trees have the same minimal weight

Slide 31: Huffman Code Properties
- Greedy algorithm
  - Chooses the best local solution at each step
  - Combines the 2 trees with the lowest frequency
  - Still yields the overall best solution
- Optimal prefix code
  - Based on statistical frequency
  - Better compression may be possible (depends on data) using other approaches (e.g., a pattern dictionary)
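
The greedy tree construction (slide 31) and the bit-by-bit decoding walk (slides 25-29) can be sketched in Java, the course's language. This is a minimal sketch, not the slides' own code: the Huffman class, its Node type, and all method names are illustrative, and the frequencies are those of the Dog/Cat/Bird/Fish example on slide 12, scaled to integer counts out of 8 symbols.

```java
import java.util.HashMap;
import java.util.Map;
import java.util.PriorityQueue;

// Illustrative sketch of Huffman coding; names are not from the slides.
public class Huffman {
    // A tree node: leaves hold a symbol; internal nodes hold two children.
    static class Node implements Comparable<Node> {
        final int weight;
        final Character symbol;       // null for internal nodes
        final Node left, right;
        Node(int weight, Character symbol) { this(weight, symbol, null, null); }
        Node(int weight, Character symbol, Node left, Node right) {
            this.weight = weight; this.symbol = symbol;
            this.left = left; this.right = right;
        }
        public int compareTo(Node o) { return Integer.compare(weight, o.weight); }
    }

    // Greedy construction: repeatedly merge the two lowest-weight trees.
    static Node buildTree(Map<Character, Integer> freq) {
        PriorityQueue<Node> pq = new PriorityQueue<>();
        for (Map.Entry<Character, Integer> e : freq.entrySet())
            pq.add(new Node(e.getValue(), e.getKey()));
        while (pq.size() > 1) {
            Node a = pq.poll(), b = pq.poll();
            pq.add(new Node(a.weight + b.weight, null, a, b));
        }
        return pq.poll();
    }

    // Walk the tree to collect codes: left edge = 0, right edge = 1.
    static void buildCodes(Node n, String prefix, Map<Character, String> codes) {
        if (n.symbol != null) { codes.put(n.symbol, prefix); return; }
        buildCodes(n.left, prefix + "0", codes);
        buildCodes(n.right, prefix + "1", codes);
    }

    // Decoding: follow bits from the root; emit a symbol at each leaf,
    // then restart at the root. The prefix property means no end-of-code
    // marker is needed (slide 30).
    static String decode(Node root, String bits) {
        StringBuilder out = new StringBuilder();
        Node n = root;
        for (char b : bits.toCharArray()) {
            n = (b == '0') ? n.left : n.right;
            if (n.symbol != null) { out.append(n.symbol); n = root; }
        }
        return out.toString();
    }

    public static void main(String[] args) {
        // Slide-12 frequencies scaled to counts out of 8 symbols:
        // Dog 1, Cat 2, Bird 4, Fish 1.
        Map<Character, Integer> freq = new HashMap<>();
        freq.put('D', 1); freq.put('C', 2); freq.put('B', 4); freq.put('F', 1);
        Node root = buildTree(freq);
        Map<Character, String> codes = new HashMap<>();
        buildCodes(root, "", codes);
        // Expected code length, weighted by frequency.
        double bits = 0;
        for (Map.Entry<Character, Integer> e : freq.entrySet())
            bits += e.getValue() * codes.get(e.getKey()).length();
        System.out.println(bits / 8);   // prints 1.75, matching slide 12
    }
}
```

Whatever order the priority queue breaks ties in, this input forces the merges 1+1, 2+2, 4+4, so the code lengths come out as on slide 12: Bird 1 bit, Cat 2 bits, Dog and Fish 3 bits each, for an expected 1.75 bits/symbol.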