Big-O, Little-o, Big-Omega, Little-Omega, and Theta notations #274

qingquan-li opened this issue Sep 30, 2024 · 0 comments

1. Big-O vs Little-o

Big-O Notation O(g(n))

  • Big-O notation describes an upper bound on the growth rate of a function.

  • It tells us that the function grows no faster than g(n) for large inputs.

  • The function f(n) is O(g(n)) if there exist constants c > 0 and n_0 such that for all n ≥ n_0, the following inequality holds:

    f(n) ≤ c⋅g(n)

  • Big-O allows f(n) and g(n) to grow at the same rate asymptotically.

[Figure: Big-O notation]

Example:

If f(n) = 2n^2 + 3n + 5, we say f(n) = O(n^2) because n^2 dominates the growth of the function as n becomes large.
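The definition above can be sanity-checked numerically. A minimal sketch (a spot-check, not a proof), assuming the hypothetical witnesses c = 10 and n_0 = 1 for this example:

```python
# Spot-check that f(n) = 2n^2 + 3n + 5 is O(n^2): with the chosen
# witnesses c = 10 and n_0 = 1, f(n) <= c * n^2 holds for all tested n.
def f(n):
    return 2 * n**2 + 3 * n + 5

c, n0 = 10, 1
assert all(f(n) <= c * n**2 for n in range(n0, 10_000))
```

Any larger c also works; Big-O only requires that some pair (c, n_0) exists.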

Little-o Notation o(g(n))

  • Little-o notation gives a strict upper bound.
  • It means that f(n) grows strictly slower than g(n), and eventually becomes insignificant compared to g(n) as n→∞.
  • The function f(n) is o(g(n)) if for every constant c > 0, there exists an n_0 such that for all n ≥ n_0, the following inequality holds:
    f(n) < c⋅g(n)
  • Little-o does not allow f(n) and g(n) to grow at the same rate; f(n) must grow strictly slower.
[Figure: Little-o notation]

Example:

If f(n) = n, then f(n) = o(n^2) because n grows strictly slower than n^2.
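One way to see this is the limit test: f(n)/g(n) = 1/n → 0 as n → ∞. A small sketch checking the definition for a few sample values of c (the helper name is ours, and this is illustrative, not a proof):

```python
# For f(n) = n and g(n) = n^2, the ratio f(n)/g(n) = 1/n tends to 0,
# so for any c > 0, choosing n_0 > 1/c gives f(n) < c * g(n) for all n >= n_0.
def witness_n0(c):
    # A valid n_0 for this particular f and g: any integer greater than 1/c.
    return int(1 / c) + 1

for c in (1.0, 0.1, 0.001):
    n0 = witness_n0(c)
    assert all(n < c * n**2 for n in range(n0, n0 + 1000))
```

Note the quantifier order: unlike Big-O, little-o must work for *every* c > 0, with n_0 allowed to depend on c.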


2. Big-Omega vs Little-Omega

Big-Omega Notation Ω(g(n))

  • Big-Omega notation provides a lower bound on the growth of a function.
  • It tells us that f(n) grows at least as fast as g(n) for large inputs.
  • The function f(n) is Ω(g(n)) if there exists a constant c > 0 and an n_0 such that for all n ≥ n_0, the following inequality holds:
    f(n) ≥ c⋅g(n)
  • Big-Omega allows f(n) and g(n) to grow at the same rate asymptotically.
[Figure: Big-Omega notation]

Example:

If f(n) = 5n^2 + 3n, then f(n) = Ω(n^2), because f(n) is at least 5n^2 for all n ≥ 0, so n^2 is a valid lower bound.
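As with Big-O, the witnesses can be spot-checked numerically. A minimal sketch assuming the hypothetical witnesses c = 5 and n_0 = 1:

```python
# Spot-check that f(n) = 5n^2 + 3n is Omega(n^2): with the chosen
# witnesses c = 5 and n_0 = 1, f(n) >= c * n^2 holds for all tested n.
def f(n):
    return 5 * n**2 + 3 * n

c, n0 = 5, 1
assert all(f(n) >= c * n**2 for n in range(n0, 10_000))
```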

Little-Omega Notation ω(g(n))

  • Little-Omega notation provides a strict lower bound.
  • It tells us that f(n) grows strictly faster than g(n) as n→∞.
  • The function f(n) is ω(g(n)) if for every constant c > 0, there exists an n_0 such that for all n ≥ n_0, the following inequality holds:
    f(n) > c⋅g(n)
  • Little-Omega does not allow f(n) and g(n) to grow at the same rate; f(n) must grow strictly faster.
[Figure: Little-Omega notation]

Example:

If f(n) = n^2, then f(n) = ω(n) because n^2 grows strictly faster than n.
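Mirroring little-o, the bound must hold for every c, with n_0 depending on c. A small sketch for this example (the helper name is ours):

```python
# For f(n) = n^2 and g(n) = n: n^2 > c * n whenever n > c,
# so for any c, taking n_0 = c + 1 satisfies the definition.
def witness_n0(c):
    # A valid n_0 for this particular f and g.
    return c + 1

for c in (1, 10, 1000):
    n0 = witness_n0(c)
    assert all(n**2 > c * n for n in range(n0, n0 + 1000))
```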


3. Theta (Θ) Notation

Theta (Θ) Notation Θ(g(n))

  • Theta (Θ) notation provides a tight bound on the growth of a function.
  • It tells us that f(n) grows at the same rate as g(n), both from above and below, for large inputs.
  • The function f(n) is Θ(g(n)) if there exist constants c_1, c_2 > 0 and an n_0 such that for all n ≥ n_0, the following inequality holds:
    c_1⋅g(n) ≤ f(n) ≤ c_2⋅g(n)
  • In other words, f(n) is bounded both above and below by g(n), up to constant factors, for large n.
[Figure: Theta notation]

Example:

If f(n) = 3n^2 + 4n, then f(n) = Θ(n^2) because n^2 dominates both the upper and lower bounds.
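The two-sided bound can be spot-checked with a single pair of constants. A minimal sketch assuming the hypothetical witnesses c_1 = 3, c_2 = 7, n_0 = 1:

```python
# Spot-check that f(n) = 3n^2 + 4n is Theta(n^2):
# 3n^2 <= f(n) holds for all n >= 0, and f(n) <= 7n^2 holds once n >= 1
# (since 4n <= 4n^2 for n >= 1), so c_1 = 3, c_2 = 7, n_0 = 1 work.
def f(n):
    return 3 * n**2 + 4 * n

c1, c2, n0 = 3, 7, 1
assert all(c1 * n**2 <= f(n) <= c2 * n**2 for n in range(n0, 10_000))
```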


Practical Applications in Algorithm Analysis:

  1. Big-O Notation: Used to describe the worst-case running time or space complexity of an algorithm. For example, Merge Sort has a time complexity of O(n log n).

  2. Big-Omega Notation: Often used to describe the best-case scenario, or the minimum time an algorithm will take. For example, an algorithm may have a best-case complexity of Ω(n), indicating linear performance in the best scenario.

  3. Theta Notation: If an algorithm's running time grows at the same rate in both the best and worst case, we use Θ to describe its tight complexity. For example, an algorithm with time complexity Θ(n) always takes linear time, regardless of input.

  4. Little-o and Little-Omega: These are used less frequently in practice, but they provide more precise descriptions of growth rates, especially when comparing two algorithms. For example, if one algorithm grows strictly slower than another, we can say f(n) = o(g(n)).
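The comparisons above can be explored empirically by sampling the ratio f(n)/g(n) at increasing n. A rough sketch; the function name and thresholds are ad hoc choices of ours, and the notations are defined by the formal inequalities, not by finite samples:

```python
import math

def growth_trend(f, g, ns=(10, 100, 1000, 10_000, 100_000)):
    """Heuristic: how does f(n)/g(n) behave as n grows?
    'to 0' suggests f = o(g), 'to infinity' suggests f = omega(g),
    'roughly constant' suggests f = Theta(g)."""
    ratios = [f(n) / g(n) for n in ns]
    if ratios[-1] < 0.01 * ratios[0]:
        return "to 0"
    if ratios[-1] > 100 * ratios[0]:
        return "to infinity"
    return "roughly constant"

# Merge Sort's n log n grows strictly slower than n^2:
assert growth_trend(lambda n: n * math.log(n), lambda n: n**2) == "to 0"
# 3n^2 + 4n grows at the same rate as n^2:
assert growth_trend(lambda n: 3 * n**2 + 4 * n, lambda n: n**2) == "roughly constant"
```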


Summary:

  • Big-O and Big-Omega provide non-strict bounds (upper and lower, respectively): f(n) may grow at the same rate as g(n). Little-o and Little-Omega provide strict bounds, where the growth rates must differ.
  • Theta (Θ) gives an exact bound, meaning the function grows at the same rate both in the upper and lower bounds.
  • In practice, Big-O and Theta (Θ) are most commonly used to describe the time and space complexity of algorithms.

[Figure: Big-O, Big-Omega, and Theta]

[Figure: graph of all notations]

Reference: https://www.slideshare.net/slideshow/asymptotic-notation-63320158/63320158
