© 2026 Froquiz. All rights reserved.

What Is Big O Notation? Understanding Algorithm Complexity With Real Examples

Why does code that works perfectly with 10 elements collapse with 10 million? Big O Notation is the mathematical language for predicting how your code will behave at scale.

Yusuf Seyitoğlu · March 5, 2026 · 10 min read


You write a function that works perfectly with 10 elements. Tests pass, code review approved, shipped to production. Three months later, when the data reaches 10 million, the system stops responding.

This scenario is real. And it happens repeatedly to developers who don't understand Big O.

What Is Big O Notation?

Big O Notation is the mathematical notation used to describe how an algorithm behaves, in terms of time and memory, as the input size grows. It tells you how many resources an algorithm will consume in the worst case.

What matters is not the absolute time, but the growth rate. Whether a function runs in 10ms or 20ms depends on the environment. But whether the time doubles, quadruples, or grows logarithmically when the data doubles: that is universal.
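One way to make "growth rate, not absolute time" concrete is to count operations instead of measuring milliseconds. A minimal sketch (the function names here are mine, not from the article; the linear and quadratic shapes are explained in the sections below):

```javascript
// Count operations rather than wall-clock time: the environment changes
// the milliseconds, but never the growth rate.
function countLinear(n) {
  let ops = 0;
  for (let i = 0; i < n; i++) ops++; // one pass over the input
  return ops;
}

function countQuadratic(n) {
  let ops = 0;
  for (let i = 0; i < n; i++) {
    for (let j = 0; j < n; j++) ops++; // a full pass per element
  }
  return ops;
}

// Doubling n doubles the linear count but quadruples the quadratic one:
countLinear(1000);     // 1000
countLinear(2000);     // 2000
countQuadratic(1000);  // 1000000
countQuadratic(2000);  // 4000000
```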

O(1) - Constant Time

No matter how large the input, the operation completes in the same amount of time.

javascript
// Accessing the first element of an array - always O(1)
function getFirst(arr) {
  return arr[0];
}

// Reading a value from an object - always O(1)
function getUser(users, id) {
  return users[id]; // hash map lookup
}

getFirst([1]);                      // 1ms
getFirst([1, 2, 3, ..., 10000000]); // 1ms (identical)

Array index access, hash map reads, stack push/pop: these are O(1). No matter how much the data grows, the time doesn't change.

O(log n) - Logarithmic Time

The problem is halved at every step. When data doubles, only one step is added.

javascript
// Binary Search - finding an element in a sorted array
function binarySearch(arr, target) {
  let left = 0;
  let right = arr.length - 1;
  while (left <= right) {
    const mid = Math.floor((left + right) / 2);
    if (arr[mid] === target) return mid;
    if (arr[mid] < target) left = mid + 1;
    else right = mid - 1;
  }
  return -1;
}

Maximum 10 steps in an array of 1,000 elements. Maximum 20 steps in 1,000,000. Even with 1,000,000,000 elements, you never need more than 30 steps: the data grew a million times over, yet only 20 steps were added.
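Those step counts fall straight out of the base-2 logarithm. A one-liner (the `maxSteps` name is mine) lets you check them:

```javascript
// Worst-case number of halvings binary search needs for n elements
const maxSteps = (n) => Math.ceil(Math.log2(n));

maxSteps(1_000);         // 10
maxSteps(1_000_000);     // 20
maxSteps(1_000_000_000); // 30
```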

Binary search, balanced binary tree lookup, heap operations: all O(log n).

O(n) - Linear Time

Each element is processed once. When data doubles, time doubles.

javascript
// Searching an array - in the worst case, every element is checked
function linearSearch(arr, target) {
  for (let i = 0; i < arr.length; i++) {
    if (arr[i] === target) return i;
  }
  return -1;
}

// Sum calculation
function sum(arr) {
  return arr.reduce((total, num) => total + num, 0);
}

1,000 steps for 1,000 elements. 1,000,000 steps for 1,000,000. Linear growth.

O(n log n) - Linearithmic Time

The most efficient sorting algorithms live here. Worse than O(n), but much better than O(n²).

javascript
// Merge Sort - O(n log n)
function mergeSort(arr) {
  if (arr.length <= 1) return arr;
  const mid = Math.floor(arr.length / 2);
  const left = mergeSort(arr.slice(0, mid));
  const right = mergeSort(arr.slice(mid));
  return merge(left, right);
}

function merge(left, right) {
  const result = [];
  let i = 0, j = 0;
  while (i < left.length && j < right.length) {
    if (left[i] <= right[j]) result.push(left[i++]);
    else result.push(right[j++]);
  }
  return [...result, ...left.slice(i), ...right.slice(j)];
}

JavaScript's Array.sort() and Python's sorted() are both O(n log n).
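One practical caveat when reaching for that built-in (a well-known JavaScript gotcha, not covered above): Array.sort() without a comparator converts elements to strings before comparing them, so numbers come out in lexicographic order.

```javascript
const nums = [2, 10, 1];

// Default sort compares elements as strings, so "10" sorts before "2"
const lexicographic = [...nums].sort();          // [1, 10, 2]

// A numeric comparator gives the expected order, still O(n log n)
const numeric = [...nums].sort((a, b) => a - b); // [1, 2, 10]
```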

O(n²) - Quadratic Time

Two nested loops. When data doubles, time quadruples.

javascript
// Bubble Sort - O(n²)
function bubbleSort(arr) {
  for (let i = 0; i < arr.length; i++) {
    for (let j = 0; j < arr.length - i - 1; j++) {
      if (arr[j] > arr[j + 1]) {
        [arr[j], arr[j + 1]] = [arr[j + 1], arr[j]];
      }
    }
  }
  return arr;
}

// Classic mistake: finding duplicates with nested loops
function hasDuplicates(arr) {
  for (let i = 0; i < arr.length; i++) {
    for (let j = i + 1; j < arr.length; j++) {
      if (arr[i] === arr[j]) return true; // O(n²)
    }
  }
  return false;
}

// The right approach: using a Set for O(n)
function hasDuplicatesFast(arr) {
  return new Set(arr).size !== arr.length; // O(n)
}

1,000,000 operations for 1,000 elements. 100,000,000 for 10,000. This is why production systems crash.

O(2ⁿ) - Exponential Time

The number of operations doubles at every step. Becomes unusable even with small inputs.

javascript
// Fibonacci - naive recursion, O(2^n)
function fibonacci(n) {
  if (n <= 1) return n;
  return fibonacci(n - 1) + fibonacci(n - 2); // each call creates two branches
}

fibonacci(10); // 177 calls
fibonacci(20); // 21,891 calls
fibonacci(40); // 331,160,281 calls
fibonacci(50); // trillions of calls → the system freezes

Reduced to O(n) with memoization:

javascript
// Fibonacci - O(n) with memoization
function fibonacci(n, memo = {}) {
  if (n in memo) return memo[n];
  if (n <= 1) return n;
  memo[n] = fibonacci(n - 1, memo) + fibonacci(n - 2, memo);
  return memo[n];
}

The Comparison: Seeing It With Numbers

How dramatic the difference becomes as n grows:

code
n = 10:
  O(1)       → 1 operation
  O(log n)   → 3 operations
  O(n)       → 10 operations
  O(n log n) → 33 operations
  O(n²)      → 100 operations

n = 1,000:
  O(1)       → 1 operation
  O(log n)   → 10 operations
  O(n)       → 1,000 operations
  O(n log n) → 10,000 operations
  O(n²)      → 1,000,000 operations

n = 1,000,000:
  O(1)       → 1 operation
  O(log n)   → 20 operations
  O(n)       → 1,000,000 operations
  O(n log n) → 20,000,000 operations
  O(n²)      → 1,000,000,000,000 operations ← minutes, even hours

Space Complexity: Memory, Not Time

Big O applies to memory usage as well as time.

javascript
// O(1) space - constant memory
function sum(arr) {
  let total = 0; // a single variable
  for (const num of arr) total += num;
  return total;
}

// O(n) space - memory proportional to input
function double(arr) {
  return arr.map(x => x * 2); // creates a new array
}

// Recursive functions occupy space on the call stack
function factorial(n) {
  if (n <= 1) return 1;
  return n * factorial(n - 1); // O(n) space: the stack goes n levels deep
}

Practical Rule: When Does It Matter?

You don't need to calculate the Big O of every piece of code. It matters for: functions that operate on large data sets, functions called inside loops, database queries (the N+1 problem), and API endpoint response times.
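The N+1 problem mentioned above can be sketched with an in-memory stand-in for a database (the tables and names below are invented for illustration; with a real database, each `find` would be a separate query):

```javascript
// Hypothetical in-memory "tables" standing in for a real database
const users = [{ id: 1, name: 'Ada' }, { id: 2, name: 'Alan' }];
const posts = [
  { id: 1, authorId: 1 },
  { id: 2, authorId: 2 },
  { id: 3, authorId: 1 },
];

// N+1 pattern: one query for the posts, then one lookup per post - O(n * m)
function authorsNaive(posts, users) {
  return posts.map(p => users.find(u => u.id === p.authorId).name);
}

// Batched pattern: build an index once, then every lookup is O(1)
function authorsBatched(posts, users) {
  const byId = new Map(users.map(u => [u.id, u]));
  return posts.map(p => byId.get(p.authorId).name);
}

authorsBatched(posts, users); // ['Ada', 'Alan', 'Ada']
```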

O(n²) is acceptable for small, fixed-size data. For growing data, aim for O(n log n) or better.
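A concrete example of that upgrade, in the same spirit as the duplicate-finding example earlier (two-sum is my example, not from the text above):

```javascript
// O(n²): check every pair
function hasPairWithSum(arr, target) {
  for (let i = 0; i < arr.length; i++) {
    for (let j = i + 1; j < arr.length; j++) {
      if (arr[i] + arr[j] === target) return true;
    }
  }
  return false;
}

// O(n): remember every value seen so far in a Set
function hasPairWithSumFast(arr, target) {
  const seen = new Set();
  for (const x of arr) {
    if (seen.has(target - x)) return true; // its partner was already seen
    seen.add(x);
  }
  return false;
}

hasPairWithSumFast([3, 8, 5, 1], 9); // true (8 + 1)
```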

When writing code, ask: "How does this behave with data 1000 times larger?" Knowing the answer is your best insurance against production surprises.
