Algorithm Analysis in Design and Analysis of Algorithms

**Contents: Definitions with Examples**

Algorithm Analysis

- Definition of Space complexity
- Definition of Time complexity
- Definition of Input size
- Definition of Asymptotic notation
- Definition of [Big "oh"]
- Definition of [Omega]
- Definition of [Theta]
- Definition of [Little "oh"]

Here are the main phases in the design and analysis of algorithms:

**A. Designing an algorithm:-**

This part consists of devising and writing an algorithm for solving the given problem. An algorithm can be designed using several well-known methods; the most common are divide and conquer, dynamic programming, branch and bound, backtracking, and the greedy method.
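As an illustration of one of these design methods, here is a minimal divide-and-conquer sketch in Python (binary search; the function name and sample data are my own, not from the text):

```python
def binary_search(arr, target):
    """Return the index of target in the sorted list arr, or -1 if absent."""
    lo, hi = 0, len(arr) - 1
    while lo <= hi:
        mid = (lo + hi) // 2          # divide: split the range in half
        if arr[mid] == target:
            return mid
        elif arr[mid] < target:
            lo = mid + 1              # conquer: search the right half
        else:
            hi = mid - 1              # conquer: search the left half
    return -1

print(binary_search([1, 3, 5, 7, 9], 7))  # 3
```

Each step discards half of the remaining range, which is what gives the method its efficiency.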

**B. Validation of the algorithm.**

First, we design the required algorithm and then check whether it computes the correct answer for all possible valid inputs. This checking step is called algorithm validation.

**C. Analysis of the algorithm.**

The analysis of an algorithm is concerned with estimating the resources it requires. Of all these estimates, the total time the algorithm needs to execute all of its instructions is taken as the measure of its performance.

**D. Testing a program.**

Write a program and run it on some sample input, then check whether it produces the expected output. If it does not, make the necessary corrections to the program and run it again. This process continues until the program produces the correct output; this part is called "debugging".

**Definition of Space complexity.**

The space complexity of an algorithm P is the amount of memory it requires to run. It has two parts: a fixed part and a variable part.

S(P) = c + Sp(instance characteristics)

where c is a constant (the fixed part) and Sp depends on the instance characteristics (the variable part).
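As a sketch of the two parts, consider this small Python function (an illustrative example of my own, not from the text): the fixed part c covers the code and the simple variables, while the variable part Sp grows with the size of the input instance.

```python
def sum_list(a):
    # Fixed part c: space for the code and the simple variables
    # s and x, independent of the input.
    # Variable part Sp(n): the n-element input list a itself, so
    # S(sum_list) = c + n words, i.e. O(n) space overall.
    s = 0
    for x in a:
        s += x
    return s

print(sum_list([1, 2, 3, 4]))  # 10
```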

**Definition of Time complexity.**

The time complexity of an algorithm describes the amount of time it takes to run. Time complexity is generally estimated by counting the number of elementary operations performed by the algorithm, assuming that each elementary operation takes a fixed amount of time.

Under this assumption, the time taken and the number of elementary operations performed by the algorithm differ by at most a constant factor.
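The operation-counting idea can be sketched in Python (an illustrative example of my own; the function and names are not from the text):

```python
def linear_search(a, target):
    """Return (index, comparisons); comparisons counts the elementary
    operations (element comparisons) the search performs."""
    comparisons = 0
    for i, x in enumerate(a):
        comparisons += 1
        if x == target:
            return i, comparisons
    return -1, comparisons

# Worst case: the target is absent, so all n elements are
# compared, giving a running time proportional to n.
print(linear_search([4, 8, 15, 16], 42))  # (-1, 4)
```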

**Definition of Input size.**

The input size is an instance characteristic: it is the number of words of memory needed to represent that instance.

**Asymptotic notation.**

Asymptotic notations are mathematical tools used to describe the time complexity of algorithms in asymptotic analysis.

The following asymptotic notations are the ones most commonly used to express the time complexity of an algorithm.

**Definition of [Big "oh"]:-**

Let f(n) and g(n) be two functions. We say f(n) is big oh of g(n), and write f(n) = O(g(n)), if and only if there exist positive constants c and n0 such that

f(n) <= c*g(n) for all n >= n0.

e.g. 3n+1 = O(n), since 3n+1 <= 4n for all n >= 1.
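The example can be checked numerically; this small Python sketch (my own, for illustration) verifies the inequality over a range of n:

```python
f = lambda n: 3 * n + 1
g = lambda n: n
c, n0 = 4, 1
# f(n) <= c*g(n) for all tested n >= n0, witnessing f(n) = O(g(n))
assert all(f(n) <= c * g(n) for n in range(n0, 10_000))
print("3n+1 = O(n): constants c=4, n0=1 work")
```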

**Definition of [Omega]:-**

f(n) is omega of g(n), written f(n) = Ω(g(n)), if and only if there exist positive constants c and n0 such that

f(n) >= c*g(n) for all n >= n0.

e.g. 3n+2 = Ω(n), since 3n+2 >= 3n for all n >= 1.
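The same numeric check works for the lower bound (again an illustrative sketch of my own):

```python
f = lambda n: 3 * n + 2
g = lambda n: n
c, n0 = 3, 1
# f(n) >= c*g(n) for all tested n >= n0, witnessing f(n) = Omega(g(n))
assert all(f(n) >= c * g(n) for n in range(n0, 10_000))
print("3n+2 = Omega(n): constants c=3, n0=1 work")
```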

**Definition of [Theta]:-**

f(n) is theta of g(n), written f(n) = Θ(g(n)), if and only if there exist positive constants c1, c2, and n0 such that

c1*g(n) <= f(n) <= c2*g(n) for all n >= n0.

e.g. 3n+3 = Θ(n), since 3n <= 3n+3 <= 4n for all n >= 3.
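Both bounds can be verified together (an illustrative sketch of my own; the constants c1 = 3, c2 = 4, n0 = 3 are one valid choice):

```python
f = lambda n: 3 * n + 3
g = lambda n: n
c1, c2, n0 = 3, 4, 3
# c1*g(n) <= f(n) <= c2*g(n) for all tested n >= n0,
# witnessing f(n) = Theta(g(n))
assert all(c1 * g(n) <= f(n) <= c2 * g(n) for n in range(n0, 10_000))
print("3n+3 = Theta(n): constants c1=3, c2=4, n0=3 work")
```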

**Definition of [Little "oh"]:-**

f(n) is little oh of g(n), written f(n) = o(g(n)), if and only if the limit of f(n)/g(n) as n → infinity is zero.

e.g. 3n+2 = o(n^2), since (3n+2)/n^2 → 0 as n → infinity.
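The vanishing ratio can be seen numerically (an illustrative sketch of my own): as n grows by factors of 10, the ratio (3n+2)/n^2 keeps shrinking toward zero.

```python
ratio = lambda n: (3 * n + 2) / n**2
ratios = [ratio(10**k) for k in range(1, 6)]  # n = 10, 100, ..., 100000
# Each ratio is strictly smaller than the previous one,
# consistent with (3n+2)/n^2 -> 0 as n -> infinity.
assert all(ratios[i] > ratios[i + 1] for i in range(len(ratios) - 1))
print(ratios[0], "->", ratios[-1])
```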
