Grok all the things

grok (v): to understand (something) intuitively.

Algorithmic Complexity

🙄 Cynics & grumps

For those eager to delve into the endless abyss of calculating efficiency, judging algorithms by their worst-case performance, and adding an extra layer of self-loathing while writing your code... Welcome!

Let's cut to the chase. Algorithmic complexity is a way of measuring... well, the complexity of an algorithm. A necessary evil, if you will, designed to show how much time or space an algorithm will consume as its input size grows. In other words, it's our futile attempt at trying to make sense of the chaos we create with our code.

If you've been around, you may have heard of notations like Big O, Big Ω, and Big Θ. And if you're anything like me, you might have struggled to remember which one means which without furrowing your brows.

But let me grace you with the knowledge:

• Big O (O): The "optimistic" one. It gives an upper bound on an algorithm's growth rate. Sure, things are bad, but this tells you how bad they can get.
• Big Ω (Ω): The "pessimistic" notation. Or the more honest one, if you ask me. It gives a lower bound on an algorithm's growth rate. Because things can't always be rainbows and butterflies.
• Big Θ (Θ): The "balanced" notation, which gives you both the upper and lower bounds at once. You know, for when you want the truth between all that optimism and pessimism.
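To make these bounds concrete, here's a minimal sketch (the function name `linear_search` is just an illustrative choice, not from the text above) of an algorithm whose cost depends on where the target sits: one comparison in the best case, n in the worst.

```python
def linear_search(items, target):
    # Worst case: target is last or absent -> n comparisons -> O(n).
    # Best case: target is first -> 1 comparison -> Ω(1).
    # No single Θ bound tighter than Θ(n) describes the worst case,
    # but the best and worst cases differ, which is exactly why
    # O and Ω are separate notations.
    for index, value in enumerate(items):
        if value == target:
            return index
    return -1

print(linear_search([3, 1, 4], 4))  # -> 2 (scans the whole list)
print(linear_search([3, 1, 4], 3))  # -> 0 (done after one comparison)
```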

Anyway, let's dive into an example that might help make sense of these abstract concepts. Let's say we want to sort a list of numbers using the infamous Bubble Sort.

Here's the Bubble Sort code for your viewing (dis)pleasure:

```python
def bubble_sort(numbers):
    n = len(numbers)
    for i in range(n):
        # After each pass, the largest remaining element has
        # bubbled to the end, so the inner loop shrinks by one.
        for j in range(0, n - i - 1):
            if numbers[j] > numbers[j + 1]:
                numbers[j], numbers[j + 1] = numbers[j + 1], numbers[j]
```
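If you want to see the pain quantified, here's a self-contained sketch: an instrumented copy of the same algorithm (the name `bubble_sort_counting` is mine, not from the text) that also counts how many comparisons it makes.

```python
def bubble_sort_counting(numbers):
    # Same algorithm as above, but returns the number of
    # comparisons performed so we can watch the growth rate.
    comparisons = 0
    n = len(numbers)
    for i in range(n):
        for j in range(0, n - i - 1):
            comparisons += 1
            if numbers[j] > numbers[j + 1]:
                numbers[j], numbers[j + 1] = numbers[j + 1], numbers[j]
    return comparisons

# The inner loop always runs n*(n-1)/2 times in total, no matter
# how the input is ordered -- hence the quadratic growth.
print(bubble_sort_counting([5, 3, 8, 1]))  # -> 6 (which is 4*3/2)
```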

It's an algorithm everyone loves to hate. The complexity of this gem is as follows: