Asymptotic Analysis of Algorithms: Worst, Best, and Average Case Analysis, Cheat Sheet of Design

An in-depth analysis of algorithms, focusing on running-time analysis, input size, types of analysis (worst, best, and average case), and comparing algorithms using asymptotic notation (O, Ω, Θ). The document also covers Big-O notation, visualizing orders of growth, and common orders of magnitude. It is a valuable resource for students studying computer science, particularly those in a data structures or algorithms course.

Typology: Cheat Sheet
2023/2024 — Uploaded on 03/04/2024 by prince-army
Analysis of Algorithms
CS 477/677 — Asymptotic Analysis
Instructor: George Bebis (Chapter 3, Appendix A)

Analysis of Algorithms
• An algorithm is a finite set of precise instructions for performing a computation or for solving a problem.
• What is the goal of analysis of algorithms?
  – To compare algorithms, mainly in terms of running time but also in terms of other factors (e.g., memory requirements, programmer's effort, etc.)
• What do we mean by running-time analysis?
  – Determine how the running time increases as the size of the problem increases.

How Do We Compare Algorithms?
• We need to define a number of objective measures.
  (1) Compare execution times? Not good: times are specific to a particular computer.
  (2) Count the number of statements executed? Not good: the number of statements varies with the programming language as well as the style of the individual programmer.

Ideal Solution
• Express running time as a function of the input size n, i.e., f(n).
• Compare the different functions corresponding to running times.
• Such an analysis is independent of machine speed, programming style, etc.

Example
• Associate a "cost" with each statement.
• Find the "total cost" by finding the total number of times each statement is executed.

  Algorithm 1          Cost    Algorithm 2           Cost
  arr[0] = 0;          c1      for(i=0; i<N; i++)    c2
  arr[1] = 0;          c1          arr[i] = 0;       c1
  arr[2] = 0;          c1
  ...
  arr[N-1] = 0;        c1
  -----------                  -------------
  Algorithm 1 total: c1 + c1 + ... + c1 = c1 × N
  Algorithm 2 total: (N+1) × c2 + N × c1 = (c2 + c1) × N + c2

Rate of Growth
• Consider the example of buying elephants and goldfish:
  Cost: cost_of_elephants + cost_of_goldfish
  Cost ~ cost_of_elephants (approximation)
• The low-order terms in a function are relatively insignificant for large n:
  n⁴ + 100n² + 10n + 50 ~ n⁴
  i.e., we say that n⁴ + 100n² + 10n + 50 and n⁴ have the same rate of growth.

Asymptotic Notation
• O notation: asymptotic "less than": f(n) = O(g(n)) means f(n) "≤" g(n).
• Ω notation: asymptotic "greater than": f(n) = Ω(g(n)) means f(n) "≥" g(n).
• Θ notation: asymptotic "equality": f(n) = Θ(g(n)) means f(n) "=" g(n).

Big-O Notation
• We say fA(n) = 30n + 8 is order n, or O(n): it is, at most, roughly proportional to n.
• fB(n) = n² + 1 is order n², or O(n²): it is, at most, roughly proportional to n².
• In general, a function of order n² eventually grows faster than any O(n) function.

Back to Our Example
• Recall the cost tables above: Algorithm 1 costs c1 × N and Algorithm 2 costs (c2 + c1) × N + c2.
• Both algorithms are of the same order: O(N).

Example (cont'd)
  Algorithm 3                    Cost
  sum = 0;                       c1
  for(i=0; i<N; i++)             c2
      for(j=0; j<N; j++)         c2
          sum += arr[i][j];      c3
  ------------
  Total: c1 + c2 × (N+1) + c2 × N × (N+1) + c3 × N² = O(N²)

Asymptotic Notations
• O-notation:
  O(g(n)) = { f(n) : there exist positive constants c and n0 such that 0 ≤ f(n) ≤ c·g(n) for all n ≥ n0 }
  g(n) is an asymptotic upper bound for f(n).

More Examples
• Show that 30n + 8 is O(n).
  – Show ∃ c, n0: 30n + 8 ≤ c·n, ∀ n > n0.
  – Let c = 31, n0 = 8. Assume n > n0 = 8. Then c·n = 31n = 30n + n > 30n + 8, so 30n + 8 < c·n.
• Note that 30n + 8 isn't less than n anywhere (n > 0).
• It isn't even less than 31n everywhere.
• But it is less than 31n everywhere to the right of n = 8.
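The cost-counting model above can be made concrete by instrumenting each algorithm with an explicit statement counter. A minimal Python sketch (the slides use C-style pseudocode; the counter and the unit costs c1 = c2 = c3 = 1 are assumptions for illustration only):

```python
def alg1_count(N):
    """Unrolled initialization arr[0] = 0; ... arr[N-1] = 0: one c1-cost statement per element."""
    arr = [None] * N
    count = 0
    for k in range(N):          # stands in for the N written-out assignments
        arr[k] = 0
        count += 1              # each assignment costs c1
    return count                # c1 * N, with c1 = 1

def alg2_count(N):
    """Loop version: the loop test runs N+1 times (cost c2 each), the body N times (cost c1)."""
    arr = [None] * N
    count = 0
    i = 0
    while True:
        count += 1              # loop test: c2, executed N+1 times
        if not (i < N):
            break
        arr[i] = 0
        count += 1              # body: c1, executed N times
        i += 1
    return count                # (N+1)*c2 + N*c1 = 2N + 1, with c1 = c2 = 1

def alg3_count(N):
    """Nested loops: the inner body runs N^2 times, so the N^2 term dominates — O(N^2)."""
    arr = [[1] * N for _ in range(N)]
    total, count = 0, 1         # "sum = 0;" costs c1
    for i in range(N):
        for j in range(N):
            total += arr[i][j]
            count += 1          # inner body: c3, executed N^2 times
    # loop-test costs omitted here for brevity; the N^2 term already dominates
    return count

print(alg1_count(10), alg2_count(10), alg3_count(10))  # 10 21 101
```

Both linear algorithms give counts proportional to N, while the nested version grows with N², matching the totals derived above.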
[Figure: Big-O example, graphically — plot of function value vs. increasing n, showing cn = 31n lying above 30n + 8 for all n > n0 = 8, so 30n + 8 ∈ O(n).]

No Uniqueness
• There is no unique set of values for n0 and c in proving the asymptotic bounds.
• Prove that 100n + 5 = O(n²):
  – 100n + 5 ≤ 100n + n = 101n ≤ 101n² for all n ≥ 5 → n0 = 5 and c = 101 is a solution.
  – 100n + 5 ≤ 100n + 5n = 105n ≤ 105n² for all n ≥ 1 → n0 = 1 and c = 105 is also a solution.
• We must find SOME constants c and n0 that satisfy the asymptotic notation relation.

Asymptotic Notations (cont.)
• Θ-notation: Θ(g(n)) is the set of functions with the same order of growth as g(n).

Examples
• n²/2 − n/2 = Θ(n²):
  – ½n² − ½n ≤ ½n² for all n ≥ 0 → c2 = ½
  – ½n² − ½n ≥ ½n² − ½n·½n = ¼n² for n ≥ 2 (since ½n ≤ ½n·½n when n ≥ 2) → c1 = ¼
• n ≠ Θ(n²): c1·n² ≤ n ≤ c2·n² only holds for n ≤ 1/c1.

Examples (cont.)
• 6n³ ≠ Θ(n²): c1·n² ≤ 6n³ ≤ c2·n² only holds for n ≤ c2/6.
• n ≠ Θ(log n): c1·log n ≤ n ≤ c2·log n would require c2 ≥ n/log n for all n ≥ n0 — impossible.

Common Orders of Magnitude
Table 1.4: Execution times for algorithms with the given time complexities (one operation per nanosecond; blank cells exceed any practical time).

  n      f(n) = lg n   f(n) = n   f(n) = n lg n   f(n) = n²    f(n) = n³      f(n) = 2ⁿ
  10     0.003 µs      0.01 µs    0.033 µs        0.1 µs       1 µs           1 µs
  20     0.004 µs      0.02 µs    0.086 µs        0.4 µs       8 µs           1 ms
  30     0.005 µs      0.03 µs    0.147 µs        0.9 µs       27 µs          1 s
  40     0.005 µs      0.04 µs    0.213 µs        1.6 µs       64 µs          18.3 min
  50     0.006 µs      0.05 µs    0.282 µs        2.5 µs       125 µs         13 days
  10²    0.007 µs      0.10 µs    0.664 µs        10 µs        1 ms           4 × 10¹³ years
  10³    0.010 µs      1.00 µs    9.966 µs        1 ms         1 s
  10⁴    0.013 µs      10 µs      130 µs          100 ms       16.7 min
  10⁵    0.017 µs      0.10 ms    1.66 ms         10 s         11.6 days
  10⁶    0.020 µs      1 ms       19.93 ms        16.7 min     31.7 years
  10⁷    0.023 µs      0.01 s     0.23 s          1.16 days    31,709 years
  10⁸    0.027 µs      0.10 s     2.66 s          115.7 days   3.17 × 10⁷ years
  10⁹    0.030 µs      1 s        29.90 s         31.7 years

* 1 µs = 10⁻⁶ second.
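The constant-hunting arguments above lend themselves to a quick numerical spot-check. A small Python sketch (a finite check over a range, which illustrates but does not replace the proofs; `holds_upper` is a helper name introduced here, not from the slides):

```python
def holds_upper(f, g, c, n0, n_max=10_000):
    """Finite spot-check (not a proof) that f(n) <= c*g(n) for all n0 <= n <= n_max."""
    return all(f(n) <= c * g(n) for n in range(n0, n_max + 1))

# 30n + 8 = O(n): c = 31 works from n0 = 8 onward, but not from n0 = 1
print(holds_upper(lambda n: 30*n + 8, lambda n: n, c=31, n0=8))      # True
print(holds_upper(lambda n: 30*n + 8, lambda n: n, c=31, n0=1))      # False

# 100n + 5 = O(n^2): two different (c, n0) pairs both work -- no uniqueness
print(holds_upper(lambda n: 100*n + 5, lambda n: n*n, c=101, n0=5))  # True
print(holds_upper(lambda n: 100*n + 5, lambda n: n*n, c=105, n0=1))  # True

# n^2/2 - n/2 = Theta(n^2): sandwiched with c1 = 1/4 (n >= 2) and c2 = 1/2
f = lambda n: n*n/2 - n/2
print(all(n*n/4 <= f(n) <= n*n/2 for n in range(2, 10_001)))         # True
```

The second check fails exactly because 30n + 8 exceeds 31n for n < 8, which is why the definition only demands the bound beyond some threshold n0.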
  1 ms = 10⁻³ second.

Logarithms and Properties
• In algorithm analysis we often use the notation "log n" without specifying the base.
  – Binary logarithm: lg n = log₂ n
  – Natural logarithm: ln n = logₑ n
  – lgᵏ n = (lg n)ᵏ
  – lg lg n = lg(lg n)
• Useful identities:
  – log(xy) = log x + log y
  – log(x/y) = log x − log y
  – log xʸ = y · log x
  – log_b x = (log_a x) / (log_a b)
  – a^(log_b x) = x^(log_b a)
  – x = b^(log_b x)

More Examples
• For each of the following pairs of functions, either f(n) is O(g(n)), f(n) is Ω(g(n)), or f(n) = Θ(g(n)). Determine which relationship is correct.
  – f(n) = log n²; g(n) = log n + 5 → f(n) = Θ(g(n))
  – f(n) = n; g(n) = log n² → f(n) = Ω(g(n))
  – f(n) = log log n; g(n) = log n → f(n) = O(g(n))
  – f(n) = n; g(n) = log² n → f(n) = Ω(g(n))
  – f(n) = n log n + n; g(n) = log n → f(n) = Ω(g(n))
  – f(n) = 10; g(n) = log 10 → f(n) = Θ(g(n))
  – f(n) = 2ⁿ; g(n) = 10n² → f(n) = Ω(g(n))
  – f(n) = 2ⁿ; g(n) = 3ⁿ → f(n) = O(g(n))

Common Summations
• Arithmetic series: 1 + 2 + ... + n = n(n + 1)/2
• Geometric series: 1 + x + x² + ... + xⁿ = (xⁿ⁺¹ − 1)/(x − 1), for x ≠ 1
  – Special case |x| < 1: 1 + x + x² + ... = 1/(1 − x)
• Harmonic series: 1 + 1/2 + ... + 1/n ≈ ln n
• Other important formulas: lg 1 + lg 2 + ... + lg n ≈ n lg n; 1ᵖ + 2ᵖ + ... + nᵖ ≈ nᵖ⁺¹/(p + 1)

Mathematical Induction
• A powerful, rigorous technique for proving that a statement S(n) is true for every natural number n, no matter how large.
• Proof:
  – Basis step: prove that the statement is true for n = 1.
  – Inductive step: assume that S(n) is true and prove that S(n + 1) is true for all n ≥ 1.
• Find case n "within" case n + 1.

Example
• Prove that 2n + 1 ≤ 2ⁿ for all n ≥ 3.
• Basis step:
  – n = 3: 2 × 3 + 1 ≤ 2³ → 7 ≤ 8. TRUE.
• Inductive step:
  – Assume the inequality is true for n, and prove it for n + 1:
    given 2n + 1 ≤ 2ⁿ, we must prove 2(n + 1) + 1 ≤ 2ⁿ⁺¹.
    2(n + 1) + 1 = (2n + 1) + 2 ≤ 2ⁿ + 2 ≤ 2ⁿ + 2ⁿ = 2ⁿ⁺¹, since 2 ≤ 2ⁿ for n ≥ 1.
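The logarithm identities above can be verified numerically; a short Python sketch using the standard `math` module (the sample values x, y, a, b are arbitrary, and tolerances are loose to absorb floating-point error):

```python
import math

x, y = 1000.0, 7.0

# Change of base: log_b x = log_a x / log_a b  (here b = 2, a = e)
print(abs(math.log(x, 2) - math.log(x) / math.log(2)) < 1e-9)       # True

# log(xy) = log x + log y  and  log(x/y) = log x - log y
print(abs(math.log(x * y) - (math.log(x) + math.log(y))) < 1e-9)    # True
print(abs(math.log(x / y) - (math.log(x) - math.log(y))) < 1e-9)    # True

# log x^y = y log x
print(abs(math.log(x ** y) - y * math.log(x)) < 1e-9)               # True

# a^(log_b x) = x^(log_b a), e.g. a = 3, b = 2
a, b = 3.0, 2.0
print(abs(a ** math.log(x, b) - x ** math.log(a, b)) < 1e-6)        # True
```

The change-of-base identity is the reason the base can be dropped inside O-notation: logarithms to different bases differ only by a constant factor.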
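The summation formulas and the induction example can likewise be spot-checked in a few lines; a brief Python sketch (the constant 0.5772, the Euler–Mascheroni constant, is brought in only to quantify the harmonic-series approximation and is not from the slides):

```python
import math

n = 100

# Arithmetic series: 1 + 2 + ... + n = n(n+1)/2
print(sum(range(1, n + 1)) == n * (n + 1) // 2)                        # True

# Geometric series: 1 + x + ... + x^n = (x^(n+1) - 1)/(x - 1), x != 1
x = 3
print(sum(x**k for k in range(n + 1)) == (x**(n + 1) - 1) // (x - 1))  # True

# Harmonic series: 1 + 1/2 + ... + 1/n ~ ln n
# (the difference approaches the Euler-Mascheroni constant, about 0.5772)
h = sum(1 / k for k in range(1, 10**5 + 1))
print(abs(h - math.log(10**5) - 0.5772) < 0.001)                       # True

# Induction example: 2n + 1 <= 2^n for all n >= 3 (spot-check a range)
print(all(2*k + 1 <= 2**k for k in range(3, 64)))                      # True
```

A finite check like the last line is no substitute for the inductive proof, but it is a quick sanity test that the basis case and the claimed range are stated correctly.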