Algorithm Analysis

Resources:

  • Data Structures and Algorithms with Object-Oriented Design Patterns in C++ by Bruno R. Preiss
  • Goodrich et al., Data Structures and Algorithms in Java

What is an Algorithm?

  • Step-by-step instructions to solve a specific problem, perform a calculation, or complete a task.

Asymptotic Notation

In computer science, Asymptotic Notation is a mathematical language used to describe the efficiency of an algorithm as the input size (usually called $n$) grows toward infinity.

It allows us to ignore hardware-specific details (like processor speed) and focus purely on how the time or space requirements of a program scale.

The Three Main Notations

We typically use three symbols to “bound” the growth of an algorithm.

Big O Notation (O): The Upper Bound

This is the most common notation. It represents the worst-case scenario. It tells you that the algorithm will never take longer than this amount of time.

  • Example: If an algorithm is $O(n^2)$, it means as the input grows, the time taken will grow no faster than the square of the input.
  • Common Examples with Code (see the combined sketch after this list):
    • $O(1)$ – Constant Time
      • Accessing a specific element in an array such as value = data[1];
    • $O(n)$ – Linear Time
      • A simple loop searching for a value
    • $O(n^2)$ – Quadratic Time
      • Comparing every item in a list to every other item
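
A minimal Java sketch of these three cases (the class and method names and the sample values are illustrative choices, not part of the original post):

public class GrowthRates {

    // O(1): indexing into an array takes the same time no matter how large the array is.
    static int constantTime(int[] data) {
        return data[1];
    }

    // O(n): in the worst case the loop visits every element once.
    static int linearSearch(int[] data, int target) {
        for (int i = 0; i < data.length; i++) {
            if (data[i] == target) {
                return i;
            }
        }
        return -1;
    }

    // O(n^2): every element is compared with every other element.
    static boolean hasDuplicate(int[] data) {
        for (int i = 0; i < data.length; i++) {
            for (int j = i + 1; j < data.length; j++) {
                if (data[i] == data[j]) {
                    return true;
                }
            }
        }
        return false;
    }

    public static void main(String[] args) {
        int[] data = {4, 7, 2, 7, 9};
        System.out.println(constantTime(data));     // 7
        System.out.println(linearSearch(data, 7));  // 1
        System.out.println(hasDuplicate(data));     // true
    }
}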

Imagine you are moving and need to send a box of items to a friend.

  • $O(1)$ – Constant Time: You drive the box to your friend’s house. It doesn’t matter if the box contains 1 item or 100 items; the drive time remains the same.
  • $O(n)$ – Linear Time: You decide to carry each item one by one. If there are 10 items, you make 10 trips. If there are 1,000 items, you make 1,000 trips. The time grows exactly as the number of items grows.

Big Omega Notation (Ω): The Lower Bound

This represents the best-case scenario. It tells you that the algorithm will take at least this much time.

  • Example: If an algorithm is $Ω(n)$, it will perform at least $n$ operations, even in the best circumstances.

Imagine you are driving to work.

  • Big O (O): In the absolute worst traffic, it will take you no more than 60 minutes.
  • Big Omega (Ω): Even if every traffic light is green and there are no other cars on the road, it will take you at least 15 minutes because of the physical distance.

The 15 minutes is your Ω (lower bound).

In the code below, the best case happens if the very first item in the list (data[0]) is the number 7.
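
The original listing is not reproduced here, so the following is a minimal Java sketch of such a search (the method name findSeven is an illustrative choice):

// A sketch, not the original listing: linear search for the value 7.
static int findSeven(int[] data) {
    for (int i = 0; i < data.length; i++) {
        if (data[i] == 7) {
            return i;  // best case: 7 is at data[0], so the loop body runs only once -> Ω(1)
        }
    }
    return -1;         // worst case: 7 is absent, so the loop runs n times -> O(n)
}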

  • The loop runs exactly once.
  • Regardless of whether the list has 10 items or 10 million items, the best case is always 1 operation.
  • Therefore, the Big Omega is Ω(1) (Constant Time).

Big Theta Notation (Θ): The Tight Bound

This is used when the upper and lower bounds are the same. It describes the exact growth rate of the algorithm.

  • Example: If an algorithm is $Θ(n \log n)$, its running time grows like $n \log n$ in both the best and worst cases, so it behaves consistently regardless of the input distribution (sketched below).
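
For instance, merge sort always splits the list in half and then merges the sorted halves, doing roughly the same work whether the input is already sorted, reversed, or random. A minimal Java sketch (not taken from the original post):

import java.util.Arrays;

public class MergeSortDemo {

    // Merge sort is Θ(n log n): the array is split about log n times,
    // and each level of splitting does Θ(n) merging work,
    // regardless of how the input is arranged.
    static void mergeSort(int[] data) {
        if (data.length < 2) {
            return;  // a single element (or empty array) is already sorted
        }
        int mid = data.length / 2;
        int[] left = Arrays.copyOfRange(data, 0, mid);
        int[] right = Arrays.copyOfRange(data, mid, data.length);
        mergeSort(left);
        mergeSort(right);
        merge(data, left, right);
    }

    // Merge two sorted halves back into the original array.
    static void merge(int[] data, int[] left, int[] right) {
        int i = 0, j = 0, k = 0;
        while (i < left.length && j < right.length) {
            data[k++] = (left[i] <= right[j]) ? left[i++] : right[j++];
        }
        while (i < left.length) {
            data[k++] = left[i++];
        }
        while (j < right.length) {
            data[k++] = right[j++];
        }
    }

    public static void main(String[] args) {
        int[] data = {5, 1, 4, 2, 3};
        mergeSort(data);
        System.out.println(Arrays.toString(data));  // [1, 2, 3, 4, 5]
    }
}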

Imagine you have a rule that you must read every single page of a 300-page book to write a report.

  • Best Case: Even if the book is boring, you read 300 pages.
  • Worst Case: Even if the book is exciting, you read 300 pages.
  • Θ (Tight Bound): Your effort is exactly proportional to the number of pages (n). It is Θ(n).

Code Example: Printing a List

Unlike a search (where you might find the item early), some tasks always take the same amount of time for a given input size $n$.
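
The original listing is not reproduced here; a minimal Java sketch of such a task (the method name printAll is an illustrative choice) might look like this:

// Printing every element: the loop body runs exactly n times for a list of n items,
// in the best case and the worst case alike, so the running time is Θ(n).
static void printAll(int[] data) {
    for (int i = 0; i < data.length; i++) {
        System.out.println(data[i]);
    }
}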
