# O(log n) Type Algorithms

Let's say we have a problem of size n. At each step of our algorithm (which we need to write), the problem shrinks to half of its previous size (n/2).

So at each step, the problem size is halved.

| Step | Problem size |
| ---- | ------------ |
| 1    | n/2          |
| 2    | n/4          |
| 3    | n/8          |
| 4    | n/16         |

The problem keeps shrinking until it cannot be reduced any further, i.e. n becomes equal to 1 and the check condition fails, at which point the problem is solved completely.

- Let's say the problem is fully reduced at the kth step (i.e. after k operations):

**problem-size** = 1

- But we know that at the kth step, the problem size should be:

**problem-size** = n/2^k

- Equating the two:

```
n/2^k = 1
or
n = 2^k
```

- Taking log on both sides:

```
log_e n = k * log_e 2
or
k = log_e n / log_e 2
```

- Using the change-of-base formula
**log_x m / log_x n = log_n m**

k = log_2 n

or simply k = log n

Now we know that our algorithm runs for at most log n steps, hence the time complexity comes out as

O(log n)

A very simple example in code to support the above text:

```
for (int i = 1; i <= n; i = i * 2)
{
    // perform some operation
}
```

So now, if someone asks you how many steps that loop (or any other algorithm that cuts its problem size in half) will take when n is 256, you can calculate it very easily.

k = log_2 256

k = log_2 2^8 (since log_a a = 1)

k = 8

Another very good example of this pattern is the **Binary Search Algorithm**.

```
int bSearch(int arr[], int size, int item) {
    int low = 0;
    int high = size - 1;
    while (low <= high) {
        // Computed this way to avoid overflow in (low + high) / 2
        int mid = low + (high - low) / 2;
        if (arr[mid] == item)
            return mid;       // item found at index mid
        else if (arr[mid] < item)
            low = mid + 1;    // search the upper half
        else
            high = mid - 1;   // search the lower half
    }
    return -1; // unsuccessful search
}
```
```