Lecture Notes: CSE 331, Summer 2002, Week 6 - Huffman Codes, Divide & Conquer, Dynamic Programming

These are the lecture notes for CSE 331, Summer 2002, Week 6. The notes cover Huffman codes via greedy algorithms, divide & conquer algorithms, and dynamic programming. Huffman codes represent characters with bit strings whose sizes are commensurate with their frequencies of use in a file. The lecture also covers the advantages and disadvantages of greedy algorithms, with Mergesort, Quicksort, Binary Search, three-way Mergesort, and maximum subsequence sum as divide & conquer examples. Lastly, the lecture discusses dynamic programming and its examples, including Fibonacci numbers and the all-pairs shortest path problem via the Floyd-Warshall algorithm.

Typology: Quizzes

Uploaded on 07/22/2009 by koofers-user-9uf-1

CSE 331, Summer 2002, Lecture #15 (6/19/02)

1. Quiz

2. Greedy Example: Huffman Codes

   (a) Problem: we would like to represent each member of a set of characters with a bit string whose size (in bits) is commensurate with that character's frequency of use in a given file.

   (b) Huffman codes create, from character use frequencies, an optimal prefix code tree (trie).

   (c) The algorithm takes in a list of characters and their frequencies as a forest, and repeatedly joins the two smallest trees into a larger tree (creating a new node each time) until there is only one tree left in the forest.

   (d) Pseudocode:

           huffman( frequency_list l )
           {
               priority_queue q;
               q.insert_list( l );
               for ( i = 1; i < l.size; i++ )
               {
                   n = new huffman_tree_node;
                   n.left = q.delete_min();
                   n.right = q.delete_min();
                   n.frequency = n.left.frequency + n.right.frequency;
                   q.insert( n );
               }
               return q.delete_min();
           }

   (e) Example.

3. Greedy conclusion

   (a) Pros: easy to come up with; small running time; may be optimal.

   (b) Cons: not always optimal; may not even find a solution for some problems.

4. Divide & Conquer

   (a) Divide: continually divide the problem into smaller subproblems until we reach a base case.

   (b) Conquer: solve each of the subproblems and combine their solutions to find the larger solution.

   (c) Studied examples: Mergesort, Quicksort.

   (d) Divide-and-conquer algorithms iterate over their input, dividing it and reducing it by some fraction each time; canonical examples are Quicksort, Mergesort, Binary Search, ...

   (e) Analysis makes frequent use of recurrence relations, often of the form usable in the master theorem: T(n) = a T(n/b) + f(n).

   (f) Thus, at each iteration, the problem is broken into a subproblems, each of size n/b, with each iteration requiring time f(n).

   (g) D&C Example: Binary Search
       i.  Work with a sorted list: at each iteration we divide the list in two and look in one half.
       ii. At each iteration we break the problem into 1 subproblem of size n/2 and require O(1) running time at each iteration.

   (h) D&C Example: Three-way Mergesort
       i.   Instead of merging two lists, we merge three.
       ii.  Divide the data set into three chunks of equal size at each iteration. Upon return, we merge the three lists and return the result.
       iii. In other words, at each iteration we break the problem into 3 subproblems, each of size n/3, and require O(n) operations at each recursion level to do this.

   (i) D&C Example: Maximum Subsequence Sum
       i.  Divide the list into two pieces each time until we reach the base case (only one value left to return) and return; now, at each level, the maximum subsequence sum can occur on the left side of our recursion, on the right side of our recursion, or spanning the two. So we check from the middle outward, find the greatest total we can there, and then compare that to the left and right results, returning the left and right endpoints of the maximum subsequence sum we have found.
       ii. At each iteration we break the problem into 2 subproblems, each of size n/2, and require O(n) operations at each level.

   (j) D&C Conclusion
       i. A good method to try when the methods that solve smaller instances of a problem can be made similar to those that solve larger ones.

5. Dynamic Programming

   (a) Studied example: Fibonacci recursion lab.

   (b) If we think of recursion as a "top-down" computational tree traversal, dynamic programming is a sort of "bottom-up" cousin that uses tables to avoid performing repeated computations.

   (c) Essentially involves recursion and memoization of sub-solutions (for later use).

   (d) DP Example: Fibonacci
       i.   We can use a table (in this case, an array) to store previously computed values that will be needed at the next level up in the tree.
       ii.  We can write our recurrence relation, D[0] = D[1] = 1, D[j] = D[j-1] + D[j-2], and use it.
       iii. Code:

               f[0] = f[1] = 1;

               fib( int n )
               {
                   int i;
                   for ( i = 2; i <= n; i++ )
                       f[i] = f[i-1] + f[i-2];
                   return f[n];
               }

   (e) DP Example: All-Pairs Shortest Path Problem - Floyd-Warshall Algorithm
       i.   Problem: given a weighted digraph G(V, E), find the cost of the shortest path from every vertex to every vertex.
       ii.  One property of shortest paths sets up the algorithm: let N = |V| and arbitrarily number the vertices {1 ... N}; now consider the subset of vertices numbered {1 ... k}; consider all paths between vertices i, j in V that use only vertices from this set, and let p be the shortest of all of them. Two cases arise: either k is on path p or it is not. If it is not, then p involves only vertices from the set {1 ... k-1}. If k is on the path between i and j, then p splits into two segments, i ~> k and k ~> j, and the intermediate vertices of each are drawn from the set {1 ... k-1}.
       iii. So what? Since we can decide, for each pair of vertices (each path), whether or not adding k is a good thing, we can effectively examine longer and longer paths by bringing in a new k at each iteration. We can form a recurrence relation to represent this:

               d_ij^(0) = w_ij
               d_ij^(k) = min( d_ij^(k-1),  d_ik^(k-1) + d_kj^(k-1) )   for k > 0
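In runnable form, the forest-joining loop of the Huffman pseudocode above might look like the following Python sketch. The tuple-based tree representation, the tie-breaking counter, and the helper names are my own assumptions, not from the lecture; the priority queue is Python's standard heapq.

```python
import heapq
import itertools

def huffman_codes(freq):
    """Return {char: bitstring} for a frequency table {char: count}.
    Trees are nested tuples; leaves are single characters."""
    tick = itertools.count()              # tie-breaker so heapq never compares trees
    heap = [(f, next(tick), c) for c, f in freq.items()]
    heapq.heapify(heap)
    while len(heap) > 1:                  # repeatedly join the two smallest trees
        f1, _, a = heapq.heappop(heap)
        f2, _, b = heapq.heappop(heap)
        heapq.heappush(heap, (f1 + f2, next(tick), (a, b)))
    codes = {}
    def walk(node, prefix):
        if isinstance(node, tuple):       # internal node: (left, right)
            walk(node[0], prefix + "0")
            walk(node[1], prefix + "1")
        else:                             # leaf: a single character
            codes[node] = prefix or "0"   # a lone symbol still needs one bit
    walk(heap[0][2], "")
    return codes
```

Because the code is a prefix code, no bit string produced is a prefix of another, and the total weighted length is optimal for the given frequencies.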
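The binary search example above divides into one subproblem of size n/2 with O(1) work per step; since the recursion is a tail call, a minimal sketch can express it as a loop (names are illustrative):

```python
def binary_search(a, target):
    """Return an index of target in sorted list a, or -1 if absent.
    T(n) = T(n/2) + O(1), i.e. O(log n) comparisons."""
    lo, hi = 0, len(a) - 1
    while lo <= hi:
        mid = (lo + hi) // 2      # divide: inspect the middle element
        if a[mid] == target:
            return mid
        if a[mid] < target:
            lo = mid + 1          # conquer: continue in the right half only
        else:
            hi = mid - 1          # ... or in the left half only
    return -1
```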
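A sketch of the three-way mergesort described above: split into three chunks, sort each, then merge the three sorted results. For brevity this hypothetical version delegates the three-way merge to the standard heapq.merge, rather than hand-rolling the three-pointer merge the lecture implies.

```python
import heapq

def mergesort3(a):
    """Three-way mergesort: 3 subproblems of size n/3, O(n) merge per level."""
    if len(a) <= 1:                       # base case: already sorted
        return a
    third = max(1, len(a) // 3)           # chunk boundary (guard tiny lists)
    left = mergesort3(a[:third])
    mid = mergesort3(a[third:2 * third])
    right = mergesort3(a[2 * third:])
    # Merge three sorted lists by repeatedly taking the smallest head element.
    return list(heapq.merge(left, mid, right))
```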
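The middle-out combine step of the maximum subsequence sum example above can be sketched as follows; this recursive version and its variable names are my own, as the lecture shows no code for it:

```python
def max_subseq_sum(a, lo=0, hi=None):
    """D&C maximum contiguous subsequence sum: 2 subproblems of size n/2
    plus O(n) work per level, so O(n log n) overall."""
    if hi is None:
        hi = len(a) - 1
    if lo == hi:                          # base case: one value left
        return a[lo]
    mid = (lo + hi) // 2
    best_left = max_subseq_sum(a, lo, mid)
    best_right = max_subseq_sum(a, mid + 1, hi)
    # Best sum crossing the middle: extend outward from mid in both directions.
    s, left_sum = 0, float("-inf")
    for i in range(mid, lo - 1, -1):
        s += a[i]
        left_sum = max(left_sum, s)
    s, right_sum = 0, float("-inf")
    for i in range(mid + 1, hi + 1):
        s += a[i]
        right_sum = max(right_sum, s)
    # The answer is the best of: left side, right side, or spanning the middle.
    return max(best_left, best_right, left_sum + right_sum)
```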
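The Floyd-Warshall recurrence above translates directly into triply nested loops, filling the d table bottom-up for k = 1, 2, ..., N. A sketch, assuming an adjacency-matrix input with 0 on the diagonal and float('inf') where no edge exists:

```python
def floyd_warshall(w):
    """All-pairs shortest path costs for an n x n weight matrix w."""
    n = len(w)
    d = [row[:] for row in w]             # d starts as d^(0) = w
    for k in range(n):                    # allow vertex k as an intermediate
        for i in range(n):
            for j in range(n):
                # Either keep the old path, or route it through k.
                if d[i][k] + d[k][j] < d[i][j]:
                    d[i][j] = d[i][k] + d[k][j]
    return d
```

Only one n x n table is needed: row k and column k are unchanged during iteration k, so updating d in place still computes the recurrence correctly.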