Operations and Properties of Asymptotic Notation (Summary: Design and Analysis of Algorithms)

These notes explain how to interpret formulas containing asymptotic notation, survey the general properties of asymptotic notations, and discuss the best, average, and worst cases of time complexity, with examples to illustrate the concepts.

Typology: Summaries

2021/2022

Available from 06/16/2022

lochansaroy47
1. Operations and properties of asymptotic notation

We have already seen how asymptotic notation can be used within mathematical formulas. For example, in introducing O-notation, we wrote "n = O(n²)". We might also write 2n² + 3n + 1 = 2n² + Θ(n). How do we interpret such formulas?

When the asymptotic notation stands alone on the right-hand side of an equation, as in n = O(n²), we have already defined the equal sign to mean set membership: n ∈ O(n²). In general, however, when asymptotic notation appears in a formula, we interpret it as standing for some anonymous function that we do not care to name. For example, the formula 2n² + 3n + 1 = 2n² + Θ(n) means that 2n² + 3n + 1 = 2n² + f(n), where f(n) is some function in the set Θ(n). In this case, f(n) = 3n + 1, which indeed is in Θ(n).

Using asymptotic notation in this manner can help eliminate inessential detail and clutter in an equation. For example, the recurrence for the worst-case running time of merge sort can be written as

T(n) = 2T(n/2) + Θ(n).

If we are interested only in the asymptotic behaviour of T(n), there is no point in specifying all the lower-order terms exactly; they are all understood to be included in the anonymous function denoted by the term Θ(n).

The number of anonymous functions in an expression is understood to be equal to the number of times the asymptotic notation appears. For example, in the expression ∑_{i=1}^{n} O(i), there is only a single anonymous function (a function of i). This expression is thus not the same as O(1) + O(2) + … + O(n), which doesn't really have a clean interpretation.

In some cases, asymptotic notation appears on the left-hand side of an equation, as in 2n² + Θ(n) = Θ(n²).
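The merge sort recurrence T(n) = 2T(n/2) + Θ(n) can be made concrete with a short sketch (illustrative Python, not part of the original notes): the two recursive calls account for the 2T(n/2) term, and the merge step does the linear amount of work hidden in the anonymous Θ(n) function.

```python
def merge_sort(a):
    """Sort a list; running time satisfies T(n) = 2T(n/2) + Theta(n)."""
    if len(a) <= 1:                    # Theta(1) base case
        return a
    mid = len(a) // 2
    left = merge_sort(a[:mid])         # T(n/2)
    right = merge_sort(a[mid:])        # T(n/2)
    # Merge step: Theta(n) work -- the anonymous function in the recurrence.
    merged = []
    i = j = 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i]); i += 1
        else:
            merged.append(right[j]); j += 1
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged

print(merge_sort([5, 2, 9, 1, 5, 6]))  # → [1, 2, 5, 5, 6, 9]
```

Solving this recurrence (for instance with the recursion tree or master method) gives the familiar T(n) = Θ(n log n), without ever naming the lower-order terms.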
We interpret such equations using the following rule: no matter how the anonymous functions are chosen on the left of the equal sign, there is a way to choose the anonymous functions on the right of the equal sign to make the equation valid. Thus, the meaning of our example is that for any function f(n) ∈ Θ(n), there is some function g(n) ∈ Θ(n²) such that 2n² + f(n) = g(n) for all n. In other words, the right-hand side of an equation provides a coarser level of detail than the left-hand side.

A number of such relationships can be chained together, as in

2n² + 3n + 1 = 2n² + Θ(n) = Θ(n²).

We can interpret each equation separately by the rule above. The first equation says that there is some function f(n) ∈ Θ(n) such that 2n² + 3n + 1 = 2n² + f(n) for all n. The second equation says that for any function g(n) ∈ Θ(n) (such as the f(n) just mentioned), there is some function h(n) ∈ Θ(n²) such that 2n² + g(n) = h(n) for all n. Note that this interpretation implies that 2n² + 3n + 1 = Θ(n²), which is what the chaining of equations intuitively gives us.

1.1. Properties of Asymptotic Notations:

To understand the concept of asymptotic notations fully, let us look at some of their general properties.

1.1.1. General (constant factors): If f(n) = O(g(n)), then a × f(n) = O(g(n)) for any constant a > 0.
e.g. f(n) = 2n² + 5 is O(n²); then 4f(n) = 8n² + 20 is also O(n²).

1.1.2. Reflexive: For any f(n), f(n) = O(f(n)).
e.g. f(n) = n² = O(n²), i.e. every function is an upper bound on itself.

1.1.3. Transitive: If f(n) = O(g(n)) and g(n) = O(h(n)), then f(n) = O(h(n)).
e.g. f(n) = n, g(n) = n², and h(n) = n³. Here f(n) = O(n²) and g(n) = O(n³), so f(n) = O(n³), i.e. f(n) = O(h(n)).

1.1.4. Symmetric: If f(n) = Θ(g(n)), then g(n) = Θ(f(n)).
e.g. f(n) = n² and g(n) = n² + 3; then f(n) = Θ(n²) and g(n) = Θ(n²).

1.1.5. Transpose Symmetric and related rules:
i. If f(n) = O(g(n)), then g(n) = Ω(f(n)).
e.g. f(n) = n and g(n) = n². Here f(n) = O(n²) and g(n) = Ω(n).
ii. (Sum rule) If f(n) = O(g(n)) and d(n) = O(e(n)), then f(n) + d(n) = O(max(g(n), e(n))). e.g.
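The properties above can be checked numerically over a finite range. The snippet below (an illustrative sketch; the helper `bounded_above` and the witness constants c and n0 are my own, not from the notes) tests whether f(n) ≤ c·g(n) holds for all n ≥ n0 up to a cutoff, which is the kind of evidence the definition of f(n) = O(g(n)) asks for.

```python
def bounded_above(f, g, c, n0, n_max=10_000):
    """Check f(n) <= c*g(n) for all n0 <= n <= n_max: finite evidence
    for f(n) = O(g(n)) with witness constants c and n0."""
    return all(f(n) <= c * g(n) for n in range(n0, n_max + 1))

# 1.1.1: f(n) = 2n^2 + 5 is O(n^2) (take c = 3, n0 = 3), and so is 4*f(n).
f = lambda n: 2 * n * n + 5
print(bounded_above(f, lambda n: n * n, c=3, n0=3))                     # True
print(bounded_above(lambda n: 4 * f(n), lambda n: n * n, c=12, n0=3))   # True

# 1.1.3 (transitivity): n = O(n^2) and n^2 = O(n^3), hence n = O(n^3).
print(bounded_above(lambda n: n, lambda n: n ** 3, c=1, n0=1))          # True
```

A finite check like this cannot prove an asymptotic bound, of course, but it is a quick sanity test for a proposed pair of constants.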
f(n) = n = O(n) and d(n) = n² = O(n²), so f(n) + d(n) = n + n² = O(n²).
iii. (Product rule) If f(n) = O(g(n)) and d(n) = O(e(n)), then f(n) × d(n) = O(g(n) × e(n)).

1.2. Best, Average, and Worst Cases:

Oftentimes, we confuse the best, average, and worst time complexities with the asymptotic notations. However, these are not always equal to O, Ω, and Θ. Let us try to understand the difference with the help of an example.

• Worst case: the maximum number of steps taken on any instance of size n.
• Best case: the minimum number of steps taken on any instance of size n.
• Average case: the average number of steps taken over all instances of size n.
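Linear search is a standard example of this distinction (an illustrative sketch in Python; the notes break off before giving their own example). Counting comparisons directly shows the gap between the cases: the best case finds the target at the first position (1 step, so the best case is Θ(1)), while the worst case scans the whole array (n steps, so the worst case is Θ(n)).

```python
def linear_search_steps(arr, target):
    """Return (index of target or -1, number of comparisons performed)."""
    steps = 0
    for i, x in enumerate(arr):
        steps += 1                 # one comparison per element examined
        if x == target:
            return i, steps
    return -1, steps

data = [7, 3, 9, 1, 4]
# Best case: target is the first element -- a single comparison.
print(linear_search_steps(data, 7))   # → (0, 1)
# Worst case: target is absent -- all n = 5 elements are compared.
print(linear_search_steps(data, 42))  # → (-1, 5)
```

Note that O, Ω, and Θ can each be applied to any one of these cases; for instance, the worst case of linear search is both O(n) and Ω(n), which is why "worst case" and "big-O" should not be treated as synonyms.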