This essay builds on the theoretical setup from the previous one. I'm going for variance over mean here: you'll see multiple types of examples with minimal concept overlap. It's a more hands-on essay (not something you can go through in parallel with your ongoing Zoom call).

The examples are snippets written in Python. They're not self-contained, so you may have to tweak them (declare/define some variables, etc.) before they'll execute.

Practically, we only care about the worst-case complexity of an algorithm (which makes sense if you ponder it a little). Hence, we'll only be calculating Big-O notation [the upper bound of the function] here.

Alright, with the housekeeping out of the way:

## Examples

#1

```python
# executes n times - {1}
for i in range(n):
    # executes m times - {2}
    for j in range(m):
        t += i * j  # executes in constant time - {3}
```


$$\begin{split} O(n)&=\{1\}\{2\}\{3\}\\&=nmc\\\therefore O(n)&=n*m \end{split}$$
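As a quick sanity check, we can count how many times the innermost statement actually runs. `count_ops` below is a throwaway helper (not part of the essay's setup) that stands in for the loop body:

```python
def count_ops(n, m):
    """Count executions of the innermost statement in example #1."""
    count = 0
    for i in range(n):
        for j in range(m):
            count += 1  # stands in for the constant-time body {3}
    return count

print(count_ops(10, 5))  # 50, i.e. exactly n * m
```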

#2

```python
# executes n times - {1}
for i in range(n):
    # executes n times at max case - {2}
    for j in range(i):
        t += i * j  # executes in constant time - {3}
```


$$\begin{split} O(n)&=\{1\}\{2\}\{3\}\\&=nnc\\\therefore O(n)&=n^2 \end{split}$$
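One detail worth spelling out: the inner loop runs `i` times, not `n`, so the body executes 0 + 1 + ... + (n-1) = n(n-1)/2 times in total. The 1/2 constant and the lower-order term get dropped, which still leaves the order of n². A hypothetical counter confirms the arithmetic:

```python
def count_ops(n):
    """Count executions of the innermost statement in example #2."""
    count = 0
    for i in range(n):
        for j in range(i):
            count += 1  # stands in for the constant-time body {3}
    return count

print(count_ops(10))  # 45 == 10 * 9 / 2
```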

#3

```python
# i doubles at every iteration (1, 2, 4, 8, ...), so after k iterations i = 2^k.
# The loop keeps running while 2^k <= n, i.e. while k <= log(n) [base 2].
# Worst case, it will run **log(n)** times - {1}
i = 1
while i <= n:
    i = i * 2  # executes in constant time - {2}
```


$$\begin{split} O(n)&=\{1\}*\{2\}\\&=log(n)*c\\\therefore O(n)&=log(n) \end{split}$$
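To see the log first-hand, here's a throwaway counter for the doubling loop (using n = 1000 purely for illustration):

```python
import math

def count_doublings(n):
    """Count iterations of the doubling loop in example #3."""
    i, steps = 1, 0
    while i <= n:
        i = i * 2
        steps += 1
    return steps

n = 1000
print(count_doublings(n), math.ceil(math.log2(n)))  # both print 10
```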

#4

```python
# executes in constant time - {1}
x = 2 * x

# executes n times - {2}
for i in range(1, n+1):
    k += k * i  # executes in constant time - {3}

# executes n times - {4}
for i in range(1, n+1):
    # executes n times - {5}
    for j in range(1, n+1):
        ...  # some constant op - {6}
```


$$\begin{split} O(n)&=\{1\}+[\{2\}\{3\}]+[\{4\}\{5\}\{6\}]\\&=c1+[nc2]+[nnc3]\\&=c1+[nc2]+[n^2c3]\\ \therefore O(n)&=n^2 \end{split}$$
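Plugging in concrete values makes the "keep only the dominant term" step tangible. A hypothetical tally, counting one unit per execution of {1}, {3}, and {6}:

```python
def total_ops(n):
    # 1 constant op {1} + n ops from the first loop + n*n from the nested loops
    return 1 + n + n * n

for n in (10, 100, 1000):
    # the ratio against n^2 approaches 1 as n grows
    print(n, total_ops(n), total_ops(n) / n**2)
```
As n grows, the n² term swamps the rest, which is why the sum collapses to O(n²).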

#5

```python
# test: executes in constant time - {1}
if x % 2 == 0:
    print('Even')  # executes in constant time - {2}
# test: executes in constant time - {3}
else:
    # executes n times - {4}
    for i in range(n):
        # test: executes in constant time - {5}
        if i > 10:
            x = x * i  # executes in constant time - {6}
        # test: executes in constant time - {7}
        else:
            continue  # executes in constant time - {8}
```


$$\begin{split} O(n)&=\{1\}\{2\}+[\{3\}\{4\}[[\{5\}\{6\}]+[\{7\}\{8\}]]]\\&=c1c2+[c3n[[c5c6]+[c7c8]]]\\&=c1+n*c2\ (consolidating\ constants)\\ \therefore O(n)&=n \end{split}$$
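The takeaway from the derivation is that Big-O follows the worst branch: even `x` finishes in constant work, while odd `x` pays for the loop. A hypothetical per-branch operation tally makes the asymmetry visible:

```python
def ops(x, n):
    """Tally operations executed by example #5 for a given x and n."""
    count = 1  # the parity test {1}
    if x % 2 == 0:
        count += 1  # the print {2}
    else:
        for i in range(n):
            count += 2  # the i > 10 test plus one constant-time statement
    return count

print(ops(2, 1000), ops(3, 1000))  # 2 vs 2001: the odd branch scales with n
```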

#6

```python
def fibonacci(n):
    # test: executes in constant time - {1}
    if n <= 1:
        # executes in constant time - {2}
        return n
    # {3}
    return fibonacci(n-1) + fibonacci(n-2)
```

{3}: Let's unroll the function using an example value *n = 3*:

```
fib(3) -> fib(2) + fib(1)
fib(2) -> fib(1) + fib(0)
fib(1) -> 1
fib(0) -> 0
```

As you can see, this creates a binary tree (graph below). Essentially, the execution time of the function is in the order of the no. of leaf nodes, i.e. 2^n.


$$\begin{split} O(n)&=c1c2[\text{no. of leaf nodes}*c3]\\&=c1c2*2^n*c3\\\therefore O(n)&=2^n \end{split}$$

Below is the execution graph for fibonacci(6)

You might have noticed that the no. of leaf nodes is nowhere near 2^n, as stated above (kudos if you did notice that)! Theoretically, the tight upper bound of the Fibonacci function is ~O(1.6^n) (the golden ratio!) because the recursion tree is not balanced: the fib(n-2) branch bottoms out sooner than the fib(n-1) branch, so the tree thins out near the edges. However, our answer is still a valid (if loose) upper bound and is commonly quoted, although at higher values of n the actual growth approaches the order of 1.6^n.
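You can watch the golden ratio emerge without timing anything by counting calls instead. `fib_calls` is a hypothetical helper (not from the essay) that counts how many times `fibonacci()` would be invoked for a given input:

```python
def fib_calls(n):
    """Number of invocations of fibonacci() for input n (1 for the call itself)."""
    if n <= 1:
        return 1
    return 1 + fib_calls(n - 1) + fib_calls(n - 2)

prev = fib_calls(10)
for n in range(11, 21):
    cur = fib_calls(n)
    print(n, cur, round(cur / prev, 4))  # the ratio creeps toward ~1.618
    prev = cur
```
The call count grows by a factor of roughly 1.618 per increment of n, matching the golden-ratio bound rather than a clean doubling.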