Fundamentals·Mar 5, 2025·9 min read

Big O Notation Explained Simply

Stop guessing the performance of your code. Learn how to analyze time and space complexity with real examples.

Why Big O Matters

When you write code, you're making a bet. You're betting that your solution will be fast enough — for 100 items, for 100,000 items, for 10 million items. Big O notation is how you stop guessing and start knowing.

It's the single most important concept for coding interviews, and once you internalize it, you start seeing the performance implications of every line you write.

What Big O Actually Measures

Big O describes how the runtime (or memory usage) of an algorithm scales with the size of the input, usually called n. It doesn't tell you exact execution time — it tells you the shape of growth.

Think of it this way: if you double the input size, does your algorithm take twice as long? Four times as long? The same amount of time? That's what Big O captures.

We always describe the worst case unless stated otherwise. An algorithm that's usually fast but occasionally terrible is still a risky algorithm.

The Complexities You Need to Know

O(1) — Constant Time

The algorithm takes the same amount of time regardless of input size.

  • Accessing an array element by index
  • Inserting into a hash map (average case)
  • Checking if a stack is empty
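In Python, for instance, each of the operations above looks like this (a minimal sketch; the variable names are just for illustration):

```python
from collections import deque

items = list(range(1_000_000))
lookup = {}
stack = deque()

element = items[500_000]    # array access by index: O(1)
lookup["key"] = 42          # hash map insert: O(1) on average
is_empty = len(stack) == 0  # emptiness check: O(1)
```

None of these get slower as `items` grows; the input size simply doesn't appear in the cost.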

O(log n) — Logarithmic

Each step cuts the remaining work in half. This is the magic of divide-and-conquer.

  • Binary search on a sorted array
  • Inserting into a balanced BST
  • Finding a number in a sorted phone book

If n = 1,000,000, log₂(n) is only about 20. That's incredible efficiency.
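Binary search is the canonical example. A sketch in Python:

```python
def binary_search(sorted_items, target):
    """Return the index of target in sorted_items, or -1 if absent."""
    lo, hi = 0, len(sorted_items) - 1
    while lo <= hi:
        mid = (lo + hi) // 2  # each iteration halves the remaining range
        if sorted_items[mid] == target:
            return mid
        elif sorted_items[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return -1
```

On a million-element list, the loop runs at most ~20 times.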

O(n) — Linear

You touch each element once. Scales proportionally with input.

  • Iterating through an array
  • Finding the maximum value
  • Linear search
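Finding the maximum is a typical linear scan, sketched here in Python:

```python
def find_max(values):
    """Touch every element exactly once: O(n) time, O(1) extra space."""
    best = values[0]
    for v in values[1:]:
        if v > best:
            best = v
    return best
```

Double the list, double the work: that proportionality is what "linear" means.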

O(n log n) — Linearithmic

The sweet spot for sorting. No comparison-based sort can do better than this in the general case (a proven lower bound).

  • Merge sort, heapsort, quicksort (average)
  • Most efficient sorting algorithms
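Merge sort makes the "log n levels, n work per level" structure visible. A compact sketch:

```python
def merge_sort(values):
    """O(n log n): log n levels of halving, O(n) merging per level."""
    if len(values) <= 1:
        return values
    mid = len(values) // 2
    left = merge_sort(values[:mid])
    right = merge_sort(values[mid:])
    # Merge two sorted halves in linear time.
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged
```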

O(n²) — Quadratic

Two nested loops over the same data. Fine for n < 1000, painful for n > 10,000.

  • Bubble sort, insertion sort
  • Checking all pairs in an array
  • Naive string matching
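"Checking all pairs" is the classic shape: two nested loops over the same data. A sketch:

```python
def has_duplicate(values):
    """Compare every pair of elements: O(n²) time, O(1) extra space."""
    n = len(values)
    for i in range(n):
        for j in range(i + 1, n):
            if values[i] == values[j]:
                return True
    return False
```

At n = 10,000 this is on the order of 50 million comparisons in the worst case, which is why quadratic algorithms stop being "fine" quickly.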

O(2ⁿ) — Exponential

Doubles with every additional element. Only acceptable for tiny inputs.

  • Recursive Fibonacci without memoization
  • Generating all subsets of a set
  • Brute force password cracking
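Naive Fibonacci shows the blow-up, and memoization shows the fix. A sketch using Python's standard `functools.lru_cache`:

```python
from functools import lru_cache

def fib_naive(n):
    # Two recursive calls per invocation: roughly O(2^n) total calls.
    if n < 2:
        return n
    return fib_naive(n - 1) + fib_naive(n - 2)

@lru_cache(maxsize=None)
def fib_memo(n):
    # Caching collapses the call tree to O(n) distinct subproblems.
    if n < 2:
        return n
    return fib_memo(n - 1) + fib_memo(n - 2)
```

`fib_naive(40)` takes noticeable seconds; `fib_memo(40)` is effectively instant.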

O(n!) — Factorial

The worst practical complexity. n = 15 already means over a trillion operations (15! ≈ 1.3 × 10¹²).

  • Generating all permutations naively
  • Brute force traveling salesman

How to Analyze Code

Follow these rules to analyze any piece of code:

Rule 1: Drop constants.

O(2n) is just O(n). O(500) is O(1). Constants don't change the shape of growth.

Rule 2: Drop lower-order terms.

O(n² + n) is O(n²). When n is large, n² dominates completely.

Rule 3: Sequential steps add.

Two separate O(n) loops = O(n + n) = O(2n) = O(n). Still linear.

Rule 4: Nested steps multiply.

A loop inside a loop = O(n × n) = O(n²).
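Rules 3 and 4 side by side, in a quick sketch:

```python
def add_vs_multiply(values):
    n = len(values)

    # Rule 3 — sequential loops add: O(n) + O(n) = O(2n) = O(n).
    total = 0
    for v in values:
        total += v
    count = 0
    for _ in values:
        count += 1

    # Rule 4 — nested loops multiply: O(n × n) = O(n²).
    pair_visits = 0
    for _ in range(n):
        for _ in range(n):
            pair_visits += 1

    return total, count, pair_visits  # pair_visits == n * n
```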

Rule 5: Recursion — think about the call tree.

A function that calls itself twice per call with input n/2 has a call tree of depth log n that doubles in width at each level. If each call does constant work, that adds up to O(n) total work.
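You can verify that shape by counting the calls directly. A sketch (the recurrence is T(n) = 1 + 2·T(n/2), which works out to about 2n calls):

```python
def call_count(n):
    """Count the calls made by f(n) = two calls to f(n // 2).

    Depth is log₂(n) and the width doubles each level,
    so the total is roughly 2n, i.e. O(n) calls."""
    if n <= 1:
        return 1
    return 1 + call_count(n // 2) + call_count(n // 2)
```

For a power of two, `call_count(n)` is exactly 2n − 1: linear, as claimed.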

Space Complexity: The Other Dimension

Time isn't the only resource. Space (memory) matters too. An algorithm that creates a new array to store results uses O(n) space. One that sorts in-place uses O(1) space.

In interviews, you'll often face trade-offs: use a hash map to go from O(n2) time to O(n) time, but accept O(n) space. Understanding this trade-off is what separates good answers from great ones.

Common space complexities:

  • O(1): Two pointers, swap variables in-place
  • O(n): Storing results in an array, recursion stack depth
  • O(n²): 2D DP table, adjacency matrix for a dense graph
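The O(1)-vs-O(n) space distinction in a quick sketch, using list reversal as the example:

```python
def reverse_in_place(values):
    """Two pointers swapping ends: O(n) time, O(1) extra space."""
    lo, hi = 0, len(values) - 1
    while lo < hi:
        values[lo], values[hi] = values[hi], values[lo]
        lo += 1
        hi -= 1
    return values

def reversed_copy(values):
    """Builds a brand-new list: O(n) time, O(n) extra space."""
    return values[::-1]
```

Both are linear in time; only the second pays for a second copy of the data.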

Practical Examples

Let's say you need to find all pairs in an array that sum to a target.

Brute force approach: Check every pair — two nested loops. O(n²) time, O(1) space.

Hash map approach: For each number, check if the complement is in the map. One pass. O(n) time, O(n) space.

Same problem, completely different performance at scale. At n = 10,000, the brute force does 100 million operations. The hash map does 10,000.
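Both approaches, sketched side by side:

```python
def pairs_brute_force(values, target):
    """Check every pair: O(n²) time, O(1) extra space."""
    found = []
    for i in range(len(values)):
        for j in range(i + 1, len(values)):
            if values[i] + values[j] == target:
                found.append((values[i], values[j]))
    return found

def pairs_hash_map(values, target):
    """One pass, remembering numbers seen so far: O(n) time, O(n) space."""
    seen = set()
    found = []
    for v in values:
        if target - v in seen:
            found.append((target - v, v))
        seen.add(v)
    return found
```

Same answers, radically different scaling: the extra O(n) memory buys the drop from quadratic to linear time.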

The Interview Habit

In every interview, after writing your solution, always say: "This solution is O(n) time and O(1) space." Then pause and ask: "Can we do better?"

That one habit signals that you think like an engineer, not just a programmer. It's one of the most reliable ways to impress your interviewer.

Ready to put this into practice?

Practice on CodingBuddyAI →