1. Big-O vs Little-o
Big-O Notation O(g(n))
Big-O notation describes an upper bound on the growth rate of a function.
It tells us that the function grows no faster than g(n) for large inputs.
The function f(n) is O(g(n)) if there exist constants c > 0 and n_0 such that for all n ≥ n_0, the following inequality holds:
f(n) ≤ c⋅g(n)
Big-O allows f(n) and g(n) to grow at the same rate asymptotically.
Example:
If f(n) = 2n^2 + 3n + 5, we say f(n) = O(n^2) because n^2 dominates the growth of the function as n becomes large.
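To make the definition concrete, here is a minimal Python sketch that checks this claim numerically. The witnesses c = 10 and n_0 = 1 are one illustrative choice, valid because 2n^2 + 3n + 5 ≤ 2n^2 + 3n^2 + 5n^2 = 10n^2 once n ≥ 1:

```python
# Numerical sanity check: f(n) = 2n^2 + 3n + 5 is O(n^2)
# with the illustrative witnesses c = 10, n_0 = 1.
def f(n):
    return 2 * n**2 + 3 * n + 5

def g(n):
    return n**2

c, n0 = 10, 1
# Sample a range of n >= n_0 and confirm f(n) <= c * g(n) on each.
assert all(f(n) <= c * g(n) for n in range(n0, 100_000))
print("f(n) <= 10 * n^2 holds for all sampled n >= 1")
```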
Little-o Notation o(g(n))
Little-o notation gives a strict upper bound.
It means that f(n) grows strictly slower than g(n), and eventually becomes insignificant compared to g(n) as n→∞.
The function f(n) is o(g(n)) if for every constant c > 0, there exists an n_0 such that for all n ≥ n_0, the following inequality holds: f(n) < c⋅g(n)
Little-o does not allow f(n) and g(n) to grow at the same rate; f(n) must grow strictly slower.
Example:
If f(n) = n, then f(n) = o(n^2) because n grows strictly slower than n^2.
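An equivalent way to see this (using the limit characterization, f(n)/g(n) → 0 as n → ∞) is to watch the ratio shrink. A minimal sketch:

```python
# Little-o sketch: f(n) = n is o(n^2) because the ratio
# f(n)/g(n) = n/n^2 = 1/n tends to 0 as n grows.
for n in [10, 1_000, 1_000_000]:
    print(n, n / n**2)  # prints 0.1, then 0.001, then 1e-06
```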
2. Big-Omega vs Little-Omega
Big-Omega Notation Ω(g(n))
Big-Omega notation provides a lower bound on the growth of a function.
It tells us that f(n) grows at least as fast as g(n) for large inputs.
The function f(n) is Ω(g(n)) if there exist constants c > 0 and n_0 such that for all n ≥ n_0, the following inequality holds: f(n) ≥ c⋅g(n)
Big-Omega allows f(n) and g(n) to grow at the same rate asymptotically.
Example:
If f(n) = 5n^2 + 3n, then f(n) = Ω(n^2), because the dominating n^2 term guarantees that f(n) grows at least as fast as n^2.
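A minimal sketch mirroring the Big-O check above. The witnesses c = 5 and n_0 = 1 are an illustrative choice, valid because 3n ≥ 0 implies 5n^2 + 3n ≥ 5n^2:

```python
# Numerical sanity check: f(n) = 5n^2 + 3n is Ω(n^2)
# with the illustrative witnesses c = 5, n_0 = 1.
def f(n):
    return 5 * n**2 + 3 * n

c, n0 = 5, 1
assert all(f(n) >= c * n**2 for n in range(n0, 100_000))
print("f(n) >= 5 * n^2 holds for all sampled n >= 1")
```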
Little-Omega Notation ω(g(n))
Little-Omega notation provides a strict lower bound.
It tells us that f(n) grows strictly faster than g(n) as n→∞.
The function f(n) is ω(g(n)) if for every constant c > 0, there exists an n_0 such that for all n ≥ n_0, the following inequality holds: f(n) > c⋅g(n)
Little-Omega does not allow f(n) and g(n) to grow at the same rate; f(n) must grow strictly faster.
Example:
If f(n) = n^2, then f(n) = ω(n) because n^2 grows strictly faster than n.
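Here the definition must hold for every c, so a sketch has to pick a suitable n_0 for each c. With f(n) = n^2 and g(n) = n, taking n_0 = c + 1 works, because n > c implies n^2 > c⋅n:

```python
# Little-omega sketch: f(n) = n^2 is ω(n) because for EVERY c > 0
# we can take n_0 = c + 1, and then n^2 > c * n for all n >= n_0.
for c in [1, 100, 1_000_000]:
    n0 = c + 1
    assert all(n**2 > c * n for n in range(n0, n0 + 1_000))
print("n^2 > c * n beyond n_0 = c + 1 for each sampled c")
```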
3. Theta (Θ) Notation
Theta (Θ) Notation Θ(g(n))
Theta (Θ) notation provides a tight bound on the growth of a function.
It tells us that f(n) grows at the same rate as g(n), both from above and below, for large inputs.
The function f(n) is Θ(g(n)) if there exist constants c_1, c_2 > 0 and an n_0 such that for all n ≥ n_0, the following inequality holds: c_1⋅g(n) ≤ f(n) ≤ c_2⋅g(n)
In other words, f(n) is bounded both above and below by g(n), up to constant factors, for large n.
Example:
If f(n) = 3n^2 + 4n, then f(n) = Θ(n^2) because n^2 dominates both the upper and lower bounds.
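A minimal two-sided sketch. The witnesses c_1 = 3, c_2 = 7, n_0 = 1 are an illustrative choice: 3n^2 ≤ 3n^2 + 4n holds for all n ≥ 0, and 3n^2 + 4n ≤ 7n^2 holds once n ≥ 1:

```python
# Numerical sanity check: f(n) = 3n^2 + 4n is Θ(n^2) with the
# illustrative witnesses c_1 = 3, c_2 = 7, n_0 = 1.
def f(n):
    return 3 * n**2 + 4 * n

c1, c2, n0 = 3, 7, 1
assert all(c1 * n**2 <= f(n) <= c2 * n**2 for n in range(n0, 100_000))
print("3 * n^2 <= f(n) <= 7 * n^2 holds for all sampled n >= 1")
```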
Practical Application in Algorithm Analysis:
Big-O Notation: Used to describe the worst-case running time or space complexity of an algorithm. For example, Merge Sort has a time complexity of O(n log n); see the sketch after this list.
Big-Omega Notation: Often used to describe the best-case scenario, or the minimum time an algorithm will take. For example, an algorithm may have a best-case complexity of Ω(n), indicating linear performance in the best scenario.
Theta Notation: If an algorithm's best-case and worst-case running times grow at the same rate, we use Θ to describe its tight complexity. For example, an algorithm with time complexity Θ(n) always takes linear time, regardless of input.
Little-o and Little-Omega: These are used less frequently in practice, but they provide more precise descriptions of growth rates, especially when comparing two algorithms. For example, if one algorithm grows strictly slower than another, we can say f(n) = o(g(n)).
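To tie the Merge Sort example to the O(n log n) claim, here is a minimal sketch of the algorithm: the recursion splits the input through O(log n) levels, with O(n) merge work per level:

```python
# Minimal merge sort: T(n) = 2T(n/2) + O(n), which is O(n log n).
def merge_sort(a):
    if len(a) <= 1:
        return a
    mid = len(a) // 2
    left = merge_sort(a[:mid])
    right = merge_sort(a[mid:])
    # Merge the two sorted halves: O(n) work at each recursion level.
    out, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            out.append(left[i]); i += 1
        else:
            out.append(right[j]); j += 1
    return out + left[i:] + right[j:]

print(merge_sort([5, 2, 9, 1, 5, 6]))  # [1, 2, 5, 5, 6, 9]
```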
Summary:
Big-O and Big-Omega provide non-strict bounds (upper and lower, respectively), allowing f(n) to grow at the same rate as g(n), while Little-o and Little-Omega provide strict bounds.
Theta (Θ) gives a tight bound, meaning the function grows at the same rate as g(n), up to constant factors, from both above and below.
In practice, Big-O and Theta (Θ) are most commonly used to describe the time and space complexity of algorithms.
Reference: https://www.slideshare.net/slideshow/asymptotic-notation-63320158/63320158