Analysis Of Algorithms: Lecture 3
Suppose, for contradiction, that n^3 = O(n^2). Then there would exist positive constants c and n0 such that n^3 <= c n^2 for all n >= n0. Dividing both sides by n^2, we get: n <= c for all n >= n0. But this is not possible; we can never choose a constant c large enough that n will never exceed it, since n can grow without bound. Thus the original assumption, that n^3 = O(n^2), must be wrong, and so n^3 != O(n^2). Big-Oh gives us a formal way of expressing asymptotic upper bounds, a way of bounding from above the growth of a function. Knowing where a function falls within the big-Oh hierarchy allows us to compare it quickly with other functions and gives us an idea of which algorithm has the best time performance.
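The contradiction above can be illustrated numerically (this is an illustration, not a proof): after dividing by n^2, the constraint becomes n <= c, so any candidate constant c would have to be at least n^3 / n^2 = n, which grows without bound. A minimal sketch:

```python
# Illustration (not a proof) that no constant c works for n^3 = O(n^2):
# the smallest c satisfying n^3 <= c * n^2 at a given n is n^3 / n^2 = n,
# which grows without bound as n grows.
def required_c(n: int) -> float:
    """Smallest c that makes n^3 <= c * n^2 hold at this particular n."""
    return n**3 / n**2

for n in (10, 100, 1_000, 10_000):
    print(n, required_c(n))  # the required c equals n itself, so it never settles
```

Running this shows the "constant" climbing with n, which is exactly why the definition of big-Oh, with its single fixed c, cannot be satisfied.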
Lower Bounds: Omega
Another way of grouping functions, like big-Oh, is to give an asymptotic lower bound. Given a complicated function f, we find a simple function g that, within a constant multiple and for large enough n, bounds f from below. Define:

Ω(g(n)) = { the set of all f such that there exist positive constants c and n0 satisfying 0 <= cg(n) <= f(n) for all n >= n0 }.

This gives us a somewhat different family of functions; now any function f that grows at least as fast as g is in Ω(g). So, for example, n^3 = Ω(n^2).

Tight Bounds: Theta
Neither big-Oh nor Omega is completely satisfying by itself; we would like a tight bound on how quickly our function grows. To say a function grows no faster than something does not tell us how slowly it grows, and vice versa. So we need something that gives a tighter bound: something that bounds a function from both above and below. We can combine big-Oh and Omega to get a new set of functions, Theta:

Θ(g(n)) = { the set of all f such that there exist positive constants c1, c2 and n0 satisfying 0 <= c1 g(n) <= f(n) <= c2 g(n) for all n >= n0 }.

It is equivalent to say:

Θ(g(n)) = { the set of functions f(n) such that f(n) = O(g(n)) and f(n) = Ω(g(n)) }.

Whenever possible, we try to bound the running time of an algorithm from both above and below with Theta notation.
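As a concrete check of the two-constant definition, here is a small sketch (the function f, the witnesses c1, c2, and n0 are chosen for illustration) showing that f(n) = 3n^2 + 2n + 7 is Θ(n^2):

```python
# Illustration (not a proof) that f(n) = 3n^2 + 2n + 7 is Theta(n^2),
# using the witnesses c1 = 3, c2 = 4, n0 = 4 from the definition.
def f(n: int) -> int:
    return 3 * n**2 + 2 * n + 7

def g(n: int) -> int:
    return n**2

c1, c2, n0 = 3, 4, 4

# Check 0 <= c1*g(n) <= f(n) <= c2*g(n) over a finite range of n >= n0.
assert all(0 <= c1 * g(n) <= f(n) <= c2 * g(n) for n in range(n0, 10_000))
print("witnesses hold on the tested range")
```

Note that n0 matters: at n = 3 we have f(3) = 40 > 4·g(3) = 36, so the upper sandwich only holds from n0 = 4 onward (and, by algebra, for all larger n, since 4n^2 - f(n) = n^2 - 2n - 7 is increasing there).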
Convenient Theorems
For polynomials, and indeed a large class of functions that are sums of monotonically increasing terms, we can simplify asymptotic analysis by noticing these rules. Below, p and q are arbitrary polynomials and c is any positive constant:

- c p(n) = Θ(p(n)).
- p(n) + q(n) = Θ(max(p(n), q(n))).
- n^k = O((1 + ε)^n) for any constant k and any positive constant ε (i.e., any polynomial is bounded above by any exponential).
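The last rule can be seen numerically: even a slow exponential like 1.1^n eventually overtakes n^3. The sketch below (the helper name `crossover` and the search limit are my own, for illustration) finds the point past which the exponential stays on top within the searched range:

```python
# Sketch of "any polynomial is bounded by any exponential": for sample
# constants k = 3 and eps = 0.1, search for the point past which
# (1 + eps)^n dominates n^k. The limit keeps floats from overflowing.
def crossover(k: int, eps: float, limit: int = 1_000) -> int:
    """Smallest n0 <= limit such that n**k <= (1+eps)**n for all n in [n0, limit]."""
    n0 = limit
    for n in range(limit, 0, -1):
        if n**k <= (1 + eps) ** n:
            n0 = n
        else:
            break  # first failure scanning downward; everything above passed
    return n0

print(crossover(3, 0.1))  # 160 for these sample constants
```

For n >= this crossover the gap only widens, since d/dn [n ln(1+ε) − k ln n] = ln(1+ε) − k/n is positive for large n.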