Discussion 1
Dr. Nina Amenta
Thursday, January 12
ECS 222A, Winter 2005

Asymptotic Notation

We begin by stating a few useful definitions.

1. f(n) = Θ(g(n)) iff there exist positive constants c1, c2, n0 such that 0 ≤ c1 g(n) ≤ f(n) ≤ c2 g(n) for all n ≥ n0.

2. f(n) = O(g(n)) iff there exist positive constants c, n0 such that 0 ≤ f(n) ≤ c g(n) for all n ≥ n0.

3. f(n) = Ω(g(n)) iff there exist positive constants c, n0 such that 0 ≤ c g(n) ≤ f(n) for all n ≥ n0.

4. f(n) = o(g(n)) iff for every positive constant c there exists an n0 such that 0 ≤ f(n) < c g(n) for all n ≥ n0.

5. f(n) = ω(g(n)) iff for every positive constant c there exists an n0 such that 0 ≤ c g(n) < f(n) for all n ≥ n0.

Note that the little-o and little-ω definitions are strictly stronger than O and Ω: the inequality must hold for every positive constant c, not merely for some c.

Proof by Substitution

It is easy to complete a proof by substitution; it is a bit harder to tell whether you have found the best possible upper bound. For example, we will show that T(n) = an + T(9n/10) is O(n log n) (which is not the best bound). To prove this, we apply the substitution method, which is essentially induction: we assume the asymptotic bound holds for all inputs smaller than n, and then show that it holds for n as well. In this case, we assume T(m) ≤ cm log m for all m < n.

T(n) = an + T(9n/10)                              (1)
     ≤ an + c(9n/10) log(9n/10)                   (2)
     = an + c(9n/10)(log(9/10) + log n)           (3)
     ≤ an + c(9n/10) log n                        (4)   [since log(9/10) < 0]
     ≤ an log n + c(9n/10) log n                  (5)   [for n ≥ 2, so log n ≥ 1]
     = n log n (a + (9/10)c)                      (6)

But what we want to show is that T(n) ≤ cn log n. Thus, we need:

a + (9/10)c ≤ c                                   (7)
a ≤ c/10                                          (8)
10a ≤ c                                           (9)

Thus, we have bounded c: as long as c ≥ 10a, we get T(n) = O(n log n). But this is not the sharpest upper bound. Now, using the same method, we show that T(n) = O(n). Assume T(m) ≤ cm for all m < n.

T(n) = an + T(9n/10)                              (10)
     ≤ an + c(9n/10)                              (11)
     = n(a + (9/10)c)

We need:

n(a + (9/10)c) ≤ cn                               (12)
a + (9/10)c ≤ c                                   (13)
a ≤ c/10                                          (14)
10a ≤ c                                           (15)

Thus, again we have bounded T(n): by choosing c ≥ 10a, we get T(n) = O(n).
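The closed form behind the second proof can be checked numerically. The sketch below (hypothetical cost model and constant: a = 3, base case T(n) = 0 for n ≤ 1, floors added so the recurrence is computable on integers) verifies that T(n) = an + T(⌊9n/10⌋) stays below the bound 10a·n that the substitution proof predicts:

```python
from functools import lru_cache

# Hypothetical instance of the recurrence: T(n) = a*n + T(floor(9n/10)),
# with T(n) = 0 for n <= 1. The constant a = 3 is an arbitrary choice.
A = 3

@lru_cache(maxsize=None)
def T(n: int) -> int:
    if n <= 1:
        return 0
    return A * n + T(9 * n // 10)

# The substitution proof says T(n) <= c*n for any c >= 10a; spot-check c = 10a.
for n in range(2, 5000):
    assert T(n) <= 10 * A * n, f"bound fails at n={n}"

print("T(n) <= 10*a*n verified for 2 <= n < 5000")
```

The check passes because each recursive level costs at most 9/10 of the previous one, so the total is bounded by the geometric series an(1 + 9/10 + (9/10)² + ···) = 10an.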
Note: when doing proofs by substitution, it is always important to remember the precise goal. For example, consider this INCORRECT PROOF that T(n) = an + 2T(n/2) is O(n).

T(n) = an + 2T(n/2)                               (16)
     ≤ an + 2(cn/2)                               (17)
     = (a + c)n = O(n)                            (18)

The last statement is absolutely true: (a + c)n = O(n). However, what we needed to show was that T(n) ≤ cn, i.e., that (a + c)n ≤ cn for some choice of c, and since a > 0 this will never be true. If instead we guess T(n) = O(n log n), the proof goes through. Assume T(m) ≤ cm log m for all m < n.

T(n) = an + 2T(n/2)                               (19)
     ≤ an + 2(c(n/2) log(n/2))                    (20)
     = an + cn(log n − log 2)                     (21)
     = an + cn(log n − 1)                         (22)

We need:

an + cn(log n − 1) ≤ cn log n                     (23)
a + c(log n − 1) ≤ c log n                        (24)
a − c + c log n ≤ c log n                         (25)
a ≤ c                                             (26)

This inequality holds for all c ≥ a. Thus T(n) = an + 2T(n/2) is O(n log n).

Random Search

The purpose of this example is to derive some classic results in a very simple and clear way. In Random-Search, we have an array A of size n, and we are looking for a particular element x in that array, i.e., an index i such that A[i] = x.

a. For this analysis, assume we choose with replacement. In other words, each time we choose an index i uniformly at random from 1 to n (we toss the elements into a bag and pluck them out one by one, returning them each time). If A[i] = x, we return; otherwise we continue to choose and replace.

To analyze the expected running time of the algorithm, let Z be a random variable equal to the number of indices picked until x is found (counting the successful pick). Since indices may be chosen more than once, and assuming x occupies exactly one position, each pick succeeds with probability 1/n independently of the others, so E[Z] can be calculated as follows:

E[Z] = Σ_{z=1}^{∞} z · Pr[Z = z]                                             (27)
     = 1 · Pr[Z = 1] + 2 · Pr[Z = 2] + 3 · Pr[Z = 3] + ···                   (28)
     = 1 · (1/n) + 2 · ((n−1)/n) · (1/n) + 3 · ((n−1)/n)² · (1/n) + ···      (29)
     = (1/n) · (1 + 2 · ((n−1)/n) + 3 · ((n−1)/n)² + ···)                    (30)
     = (1/n) Σ_{j=0}^{∞} (j + 1) ((n−1)/n)^j                                 (31)
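The series in (31) is the expectation of a geometric random variable with success probability 1/n, and the standard identity Σ_{j≥0}(j+1)x^j = 1/(1−x)² gives E[Z] = (1/n)·n² = n. A quick Monte Carlo check (a sketch with hypothetical helper names; the target x is placed at index 0 without loss of generality) agrees:

```python
import random

def random_search_picks(n: int, rng: random.Random) -> int:
    """Count picks-with-replacement until the unique target index is hit.

    The target is placed at index 0 WLOG, so each uniform pick over
    range(n) succeeds with probability exactly 1/n.
    """
    picks = 1
    while rng.randrange(n) != 0:
        picks += 1
    return picks

rng = random.Random(222)  # fixed seed so the experiment is reproducible
n, trials = 50, 200_000
avg = sum(random_search_picks(n, rng) for _ in range(trials)) / trials
print(f"n = {n}: simulated E[Z] = {avg:.2f}  (theory predicts {n})")
```

With 200,000 trials the sample mean lands within a fraction of a pick of the predicted value n, matching the geometric-distribution analysis above.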