This is one of the more formal, academic essays. The aim is to give a level-one theoretical sketch of the analysis of algorithms.
In computer science, the analysis of algorithms is the process of finding the computational complexity of algorithms – the amount of time, storage, or other resources needed to execute them.
An algorithm is a series of unambiguous instructions to solve a given problem. Imagine a problem as a point (P) in a two-dimensional Euclidean space, and its solution as another point (S) some distance away. As you can imagine, multiple paths connect P → S, where each path is a potential algorithm for solving the problem.
The goal of the analysis of algorithms is to compare algorithms (mainly in terms of run time, but also memory, development effort, etc.). These metrics, however, depend on the developer and the computer, so we need a more objective system to quantify the performance of algorithms.
An asymptote is a straight line that continually approaches a given curve but does not meet it at any finite distance.
In asymptotics (asymptotic analysis), we are interested in how the complexity of a function $f(n)$ grows with its input $n$. As an illustration, if we have a function

$$f(n) = n^2 + 3n$$

then as $n$ becomes large, the term $3n$ becomes insignificant compared to $n^2$. Hence, the function $f(n)$ is said to be asymptotically equivalent ($\sim$) to $n^2$:

$$f(n) \sim n^2 \quad (n \to \infty)$$
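As a quick numerical sketch (an illustration, not part of the formal definition), we can watch the ratio $f(n)/n^2$ approach 1 as $n$ grows:

```python
# Asymptotic equivalence, numerically: for f(n) = n^2 + 3n,
# the ratio f(n) / n^2 = 1 + 3/n tends to 1 as n grows.

def f(n):
    return n**2 + 3*n

for n in [10, 1_000, 100_000]:
    print(n, f(n) / n**2)
# The printed ratios shrink toward 1: 1.3, 1.003, 1.00003
```

The lower-order term $3n$ contributes less and less, which is exactly what the $\sim$ relation captures.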
The Big-O notation ($O$) denotes the tight upper bound of a given function. It measures the worst-case computational complexity of an algorithm. The binary relation can be defined as:

$$f(n) = O(g(n)) \iff \exists \, c > 0, \, n_0 > 0 : |f(n)| \le c \cdot g(n) \ \text{for all} \ n \ge n_0$$
For example, if
$$f(n) = 7n^3 + 2n^2 + 20n + 8$$
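For a polynomial like this, the cubic term dominates, so $f(n)$ is bounded above by a constant multiple of $n^3$. A minimal sketch of checking such a bound numerically (the constants $c = 8$ and $n_0 = 6$ here are one workable choice picked for illustration, not the only one):

```python
# Sketch: verify that f(n) = 7n^3 + 2n^2 + 20n + 8 satisfies
# f(n) <= c * n^3 for all n >= n0, using the (illustrative)
# witness constants c = 8 and n0 = 6.

def f(n):
    return 7*n**3 + 2*n**2 + 20*n + 8

c, n0 = 8, 6
assert all(f(n) <= c * n**3 for n in range(n0, 10_000))
print("bound holds for all tested n >=", n0)
```

Any larger $c$ (with a suitable $n_0$) also works; Big-O only asks that some such pair of constants exists.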