Asymptotic analysis

Asymptotic analysis provides the mathematical foundation for describing an algorithm's run-time performance. If an algorithm takes no input, it will always run in constant time. In short, asymptotic analysis expresses the running time of any process or algorithm in mathematical terms.

As we saw in the analysing-the-algorithm article, there are two types of analysis:

  1. Empirical analysis - Run the algorithm and measure how much processor time and storage are needed.

  2. Theoretical analysis - Mathematically computing how much time and space are needed as a function of input size.

We have also seen that empirical analysis is not really feasible, as you would have to buy different hardware and software, run the algorithm on each, and record the running time and storage it uses.

So, the process of buying new hardware and running the algorithm with different software is a tedious task.


The running time of an algorithm depends on two factors:

  1. Input size

  2. Nature of input

Generally, the running time of an algorithm grows with the input size; for example, sorting 100 numbers takes less time than sorting 10^10 numbers. So, the running time of an algorithm is usually measured as a function of the input size.

Instead of measuring the actual time required to execute each statement in the code, we consider how many times each statement is executed. So, the theoretical complexity of an algorithm is measured in terms of the number of steps/primitive operations performed.
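For instance, here is a small JavaScript sketch (not from the original article) that counts how many times the key statement of a simple summation loop executes; the count grows with the input size rather than with wall-clock time:

function sumArray(array){
    var steps = 0;  // how many primitive operations (additions) we perform
    var total = 0;
    for(var i = 0; i < array.length; i++){
        total += array[i];
        steps++;  // the addition runs once per element
    }
    return { total: total, steps: steps };
}

// For an array of n elements the addition executes n times,
// so the step count is a function of the input size n.
console.log(sumArray([3, 1, 4, 1, 5]).steps); // 5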

Problem & Instance

Let's see what a problem and its instance are.

Instance

The instance of the problem consists of the input needed to compute the solution to the problem.

For example:

  • Problem: multiply two numbers, 8 and 9.

  • Instance: 8 and 9

Instance size

Any integer (generally n) that in some way measures the number of components in an instance.

For example:

  • Sorting problem - Instance size will be the number of elements to be sorted.

  • Graph problem - Instance size will be the number of nodes/edges in the graph.

Linear search

A linear search or sequential search is a method for finding an element within a list. It sequentially checks each element of the list until a match is found or the whole list has been searched.

Consider an array of numbers in which you need to find the value 41. To do so, you go through the positions of the array one by one and stop the iteration as soon as you find 41.

Algorithm

function linearSearch(array, toFind){
    for(var i = 0; i < array.length; i++){
        if(array[i] === toFind){
            // return the position of the found number
            return i;
        }
    }

    // the number was not found
    return -1;
}
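
For example, a quick usage sketch (the sample values below are illustrative, not from the article):

var numbers = [15, 8, 23, 41, 7];
console.log(linearSearch(numbers, 41)); // 3, the index at which 41 was found
console.log(linearSearch(numbers, 99)); // -1, 99 is not in the array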

Cases of algorithmic complexity

There are three cases of algorithmic complexity:

  • Best Case - Minimum number of comparisons required

  • Average Case - Average number of comparisons required

  • Worst Case - Maximum number of comparisons required

Best Case | Average Case | Worst Case
Resource usage is minimum | Resource usage is average | Resource usage is maximum
Algorithm's behaviour under optimal conditions | Algorithm's behaviour under random conditions | Algorithm's behaviour under worst conditions
Minimum number of steps required | Average number of steps required | Maximum number of steps required
Lower bound on running time | Average bound on running time | Upper bound on running time
Generally does not occur in real life | Mostly used in real life | Mostly used in real life

Consider the below array of numbers.

[45, 78.12, 90]

The best case for linear search on the above array is finding the number 45: since it is at the first position of the array, it requires only a single comparison.

The average case for linear search on the above array is finding a number lying between 45 and 90 (here, 78.12).

The worst case for linear search on the above array is finding 90, as it requires the maximum number of comparisons.
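
To make the three cases concrete, here is a small sketch (not part of the original article) that counts the comparisons linear search performs on the array above:

function countComparisons(array, toFind){
    var comparisons = 0;
    for(var i = 0; i < array.length; i++){
        comparisons++;  // one comparison per visited element
        if(array[i] === toFind){
            break;
        }
    }
    return comparisons;
}

var values = [45, 78.12, 90];
console.log(countComparisons(values, 45));    // 1 comparison  -> best case
console.log(countComparisons(values, 78.12)); // 2 comparisons -> average case
console.log(countComparisons(values, 90));    // 3 comparisons -> worst case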


What exactly is asymptotic analysis?

Computing the running time of an algorithm's operations in mathematical units of computation, and defining the mathematical formula of its run-time performance, is referred to as asymptotic analysis.

An algorithm may not have the same performance for different kinds of inputs: a change in the input size or in the nature of the input changes the overall performance of the algorithm. As we have seen in the case of linear search, the running time depends entirely on the input given to the algorithm.

Asymptotic analysis studies how the performance of an algorithm changes as the order of the input size changes. Using asymptotic analysis, we can define the best case, average case, and worst case of an algorithm.

There are three main mathematical notations used in asymptotic analysis: Big-O, Omega (Ω), and Theta (θ).


These are also known as the algorithm's growth rates; let's see them one by one.

1. Big Oh notation - Upper bound

The notation O(n) is the formal way to express the upper bound of an algorithm's running time. It measures the worst-case time complexity, that is, the longest amount of time the algorithm can possibly take to complete.


O(g(n)) = { f(n) : there exist positive constants c and n0 such that 0 ≤ f(n) ≤ c·g(n) for all n ≥ n0 }

  • g(n) is an asymptotic upper bound for f(n).

  • An upper bound g(n) of an algorithm defines the maximum time required; we can always solve the problem in at most g(n) time.

  • The time taken by a known algorithm to solve a problem with worst-case input gives the upper bound.

n | f(n) = n^2 | g(n) = 2^n | comparison
1 | 1 | 2 | f(n) < g(n)
2 | 4 | 4 | f(n) = g(n)
3 | 9 | 8 | f(n) > g(n)
4 | 16 | 16 | f(n) = g(n)
5 | 25 | 32 | f(n) < g(n)
6 | 36 | 64 | f(n) < g(n)
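
As a quick sanity check of the definition (an illustrative sketch, not from the article), we can verify numerically that f(n) = n^2 never exceeds c·g(n) with g(n) = 2^n once we pick c = 1 and n0 = 4:

var c = 1;
var n0 = 4;
for(var n = n0; n <= 20; n++){
    var f = n * n;              // f(n) = n^2
    var g = Math.pow(2, n);     // g(n) = 2^n
    // the Big-O condition: 0 <= f(n) <= c * g(n) for all n >= n0
    console.log(n, f <= c * g); // prints true for every n checked
}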

2. Omega notation - Lower bound

Big Omega notation (Ω) is used to define the lower bound of an algorithm, that is, its best case. It indicates the minimum time the algorithm requires for any input value. When the time complexity of an algorithm is expressed with big-Ω, it means the algorithm will take at least this much time to complete its execution; it can certainly take more time than that.


Ω(g(n)) = { f(n) : there exist positive constants c and n0 such that 0 ≤ c·g(n) ≤ f(n) for all n ≥ n0 }

  • ๐‘”(๐‘›)๐‘”(๐‘›) is an asymptotically lower bound for ๐‘“(๐‘›)๐‘“(๐‘›).

  • ๐‘“(๐‘›)=ฮฉ(๐‘”(๐‘›))๐‘“(๐‘›)= ฮฉ(๐‘”(๐‘›)) implies: ๐’‡(๐’)โ‰ฅ๐’„.๐’ˆ(๐’)๐’‡(๐’) โ‰ฅ ๐’„.๐’ˆ(๐’)

  • A lower bound ๐‘”(๐‘›)๐‘”(๐‘›) of an algorithm defines the minimum time required, it is not possible to have any other algorithm (for the same problem) whose time complexity is less than ๐‘”(๐‘›)๐‘”(๐‘›) for random input.

n | f(n) = 2^n | g(n) = n^2 | comparison
1 | 2 | 1 | f(n) > g(n)
2 | 4 | 4 | f(n) = g(n)
3 | 8 | 9 | f(n) < g(n)
4 | 16 | 16 | f(n) = g(n)
5 | 32 | 25 | f(n) > g(n)
6 | 64 | 36 | f(n) > g(n)
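
Again as an illustrative sketch (not from the article), the Ω condition can be checked numerically for f(n) = 2^n and g(n) = n^2 with c = 1 and n0 = 4:

var c = 1;
var n0 = 4;
for(var n = n0; n <= 20; n++){
    var f = Math.pow(2, n);     // f(n) = 2^n
    var g = n * n;              // g(n) = n^2
    // the Omega condition: 0 <= c * g(n) <= f(n) for all n >= n0
    console.log(n, c * g <= f); // prints true for every n checked
}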

3. Theta notation - Same order

The notation θ(n) is the formal way to express both the lower bound and the upper bound of an algorithm's running time. Since it represents both the upper and the lower bound of the running time, it is used for analyzing the average-case complexity of an algorithm. The time complexity represented by the big-θ notation is the range within which the actual running time of the algorithm will lie. So, it defines the exact asymptotic behaviour of an algorithm.


  • θ(g(n)) is a set; we can write f(n) ∈ θ(g(n)) to indicate that f(n) is a member of θ(g(n)).

  • g(n) is an asymptotically tight bound for f(n).

  • f(n) = θ(g(n)) implies f(n) = c·g(n).

Asymptotic notations revisited

O-Notation (Big O notation) (Upper Bound)

O(g(n)) = { f(n) : there exist positive constants c and n0 such that 0 ≤ f(n) ≤ c·g(n) for all n ≥ n0 }

i.e. f(n) = O(g(n))

Ω-Notation (Omega notation) (Lower Bound)

Ω(g(n)) = { f(n) : there exist positive constants c and n0 such that 0 ≤ c·g(n) ≤ f(n) for all n ≥ n0 }

i.e. f(n) = Ω(g(n))

θ-Notation (Theta notation) (Same order)

θ(g(n)) = { f(n) : there exist positive constants c1, c2 and n0 such that 0 ≤ c1·g(n) ≤ f(n) ≤ c2·g(n) for all n ≥ n0 }

i.e. f(n) = θ(g(n))

Asymptotic Notations โ€“ Examples

Example - 1

Let's suppose we have two functions, f(n) = n^2 and g(n) = n. By looking at the functions, we can say that f(n) ≥ g(n) for all n ≥ 1 (take c = 1 and n0 = 1), which implies f(n) = Ω(g(n)).

Example - 2

For another example, suppose we have the two functions f(n) = n and g(n) = n^2. By looking at the functions, we can say that f(n) ≤ g(n) for all n ≥ 1, which implies f(n) = O(g(n)).

Common orders of Magnitude

The computer science community has defined some common orders of magnitude; the table below lists them along with their values for increasing n.

n | log n | n log n | n^2 | n^3 | 2^n | n!
4 | 2 | 8 | 16 | 64 | 16 | 24
16 | 4 | 64 | 256 | 4096 | 65536 | 2.09 × 10^13
64 | 6 | 384 | 4096 | 262144 | 1.84 × 10^19 | 1.27 × 10^89
256 | 8 | 2048 | 65536 | 16777216 | 1.15 × 10^77 | ∞
1024 | 10 | 10240 | 1048576 | 1.07 × 10^9 | 1.79 × 10^308 | ∞
4096 | 12 | 49152 | 16777216 | 6.87 × 10^10 | 10^1233 | ∞
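
The table can be reproduced with a few lines of code (a sketch; note that JavaScript's double-precision numbers overflow to Infinity for the largest entries, such as 2^1024 and any factorial above 170!, which is roughly what the ∞ entries indicate):

function factorial(n){
    var result = 1;
    for(var i = 2; i <= n; i++){
        result *= i;
    }
    return result;
}

[4, 16, 64, 256, 1024, 4096].forEach(function(n){
    console.log(
        n,
        Math.log2(n),      // log n
        n * Math.log2(n),  // n log n
        n * n,             // n^2
        Math.pow(n, 3),    // n^3
        Math.pow(2, n),    // 2^n
        factorial(n)       // n!
    );
});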

Conclusion

We have seen an introduction to algorithm analysis and its types. We have also seen the asymptotic notations along with examples, common orders of magnitude, and more.

Thanks for reading the article.
