A program takes 35 seconds for input size 20 (i.e., n=20). Ignoring the effect of constants, approximately how much time can the same program be expected to take if the input size is increased to 100 given the following run-time complexities?
a. O(N)
b. O(N + log N)
c. O(N^3)
d. O(2^N)

Answer:

Given:

[tex]n_i=20\\t_i=35\ s\\n_f=100[/tex]

To Find:

Expected time for:

a. O(N)

b. O(N + log N)

c. O(N^3)

d. O(2^N)

Solution:

a.

For O(N), the relation between time and input size is

t = kn

Substituting the given values,

35 = k(20), so k = 1.75

t = 1.75 × 100 = 175 seconds.
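As a quick check, the linear scaling above can be computed directly (a sketch, not part of the original solution; the variable names are illustrative):

```python
# O(N): time scales linearly with input size, t = k * n
t_old, n_old, n_new = 35, 20, 100
k = t_old / n_old            # 35 / 20 = 1.75
t_new = k * n_new            # 1.75 * 100 = 175.0 seconds
print(t_new)
```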

b.

In O(N + log N), N is the dominant term, so the complexity simplifies to O(N).

So, t = 175 seconds, the same as in part a.

c.

The relation is given by :

[tex]t=kn^3[/tex]

So ,

[tex]35=k(20)^3\\\\t=k(100)^3\\\\t=35(\dfrac{100}{20})^3\\\\t=4375\ seconds[/tex]
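The cubic case can be verified the same way: the constant k cancels, leaving a factor of (100/20)^3 = 125 (again a sketch with illustrative names):

```python
# O(N^3): time scales with the cube of input size, t = k * n**3
t_old, n_old, n_new = 35, 20, 100
t_new = t_old * (n_new / n_old) ** 3   # 35 * 5^3 = 35 * 125 = 4375.0 seconds
print(t_new)
```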

d.

The relation is given by :

[tex]t=k(2)^n[/tex]

So ,

[tex]35=k(2)^{20}\\\\t=k(2)^{100}\\\\t=35\times (2)^{80}\approx 4.2\times 10^{25}\ seconds[/tex]
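For the exponential case, dividing the two relations leaves a factor of 2^(100−20) = 2^80, which Python's arbitrary-precision integers can evaluate exactly (illustrative sketch):

```python
# O(2^N): time doubles with each unit increase in n, t = k * 2**n
t_old, n_old, n_new = 35, 20, 100
t_new = t_old * 2 ** (n_new - n_old)   # 35 * 2^80, roughly 4.2e25 seconds
print(t_new)
```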

Hence, this is the required solution.