Reference:
- Data Structures and Algorithms with Object-Oriented Design Patterns in C++ by Bruno R. Preiss, Page 34
Key Concept:
Suppose we have two algorithms, A and B, that solve the same problem. How do we decide which one is better? A standard approach is to compare how their running times grow as a function of the input size n, using asymptotic bounds.

Upper Bound – Big Oh
What is an upper bound for a function f(n)?

The formal definition is:
Formal Definition: Big O Notation
For a function \( f(n) \) that is non-negative for all integers \( n \ge 0 \):
If there exist
- an integer \( n_0 \), and
- a constant \( c > 0 \),
such that for all \( n \ge n_0 \):
\( f(n) \le c \cdot g(n) \)
then we write:
\( f(n) = O(g(n)) \)
Note: This says that \( g(n) \) is an upper bound for \( f(n) \). Beyond the point \( n_0 \), the function never exceeds the scaled bounding function \( c \cdot g(n) \).
Practical Example: Proving O(n)
The Function: f(n) = 3n + 8
Goal: Find a constant c and an integer n₀ such that:
3n + 8 ≤ c · n for all n ≥ n₀
Step 1: Choose a Constant
Let c = 4. This choice is strategic because 4 is greater than the coefficient of our highest-power term (3).
Step 2: Solve the Inequality
3n + 8 ≤ 4n
8 ≤ 4n – 3n
8 ≤ n
Conclusion:
Since the inequality holds whenever n ≥ 8, we can confirm that f(n) = O(n) using c = 4 and n₀ = 8.

Another example: proving that \( f(n) = 8n + 128 \) is both \( O(n^2) \) and \( O(n) \).
Calculation Steps for O(n²)
1. The Inequality: We must show 8n + 128 ≤ c · n², i.e. that a linear function is bounded by a quadratic one.
2. Pick Constant c: Let c = 1.
3. Algebraic Solution:
8n + 128 ≤ n²
n² – 8n – 128 ≥ 0
(n – 16)(n + 8) ≥ 0
Since n is non-negative, (n + 8) > 0, so the product is non-negative exactly when n ≥ 16.
4. Conclusion: The condition holds true for the constant c = 1 and the threshold n₀ = 16.
Calculation Steps for \( O(n) \)
1. The Inequality: We must satisfy \( 8n + 128 \le c \cdot n \).
2. Pick Constant \( c \): Let \( c = 9 \).
3. Algebraic Solution:
\( 8n + 128 \le 9n \)
\( 128 \le 9n - 8n \)
\( 128 \le n \)
4. Conclusion: The condition holds for \( c = 9 \) and \( n_0 = 128 \).
Properties of Big Oh
Properties of Combined Functions
When dealing with two separate functions, f₁(n) and f₂(n), each with its own upper bound, we use specific rules to determine the complexity of their combination. These properties are the foundation of asymptotic analysis in DSA.
1. The Sum Rule (Sequential)
Applies when two operations run sequentially (one function runs to completion, then the next starts). If f₁(n) = O(g₁(n)) and f₂(n) = O(g₂(n)), then f₁(n) + f₂(n) = O(max(g₁(n), g₂(n))): the combined complexity is determined by whichever bound grows faster.
Logic: As n grows very large, the slower-growing term becomes mathematically insignificant compared to the faster-growing one.
Example: If f₁(n) = O(n) and f₂(n) = O(n²), then f₁(n) + f₂(n) = O(n²).
2. The Product Rule (Nested)
Applies when operations are nested (e.g., for every iteration of the first function, the entire second function runs). If f₁(n) = O(g₁(n)) and f₂(n) = O(g₂(n)), then f₁(n) · f₂(n) = O(g₁(n) · g₂(n)).
Logic: The total number of operations is the product of the individual operation counts.
Example: If f₁(n) = O(n) and f₂(n) = O(log n), then f₁(n) · f₂(n) = O(n log n).