1. Master theorem (6 points)

Use the master theorem to find tight asymptotic bounds for the following recurrences. If the master theorem cannot be applied, state the reason and give an upper bound (big-Oh) that is as tight as you can justify. Justify your answers (by showing which case of the master theorem applies, and why).

(1) T(n) = 9T(n/3) + √n

Case 1:

√n = n^(1/2) ∈ O(n^(log₃ 9 − ε))

n^(1/2) ∈ O(n^(2 − ε)) for ε = 1

n^(1/2) ∈ O(n^(2 − 1)) = O(n)

n^(1/2) ≤ C · n for C = 1 and n₀ = 1, so T(n) = Θ(n²)
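As a quick numerical sanity check of this answer (not part of the proof), the recurrence T(n) = 9T(n/3) + √n can be evaluated directly; if T(n) = Θ(n²), the ratio T(n)/n² should settle toward a constant. The base-case cost T(1) = 1 below is an arbitrary assumption:

```python
import math
from functools import lru_cache

@lru_cache(maxsize=None)
def T(n):
    # T(n) = 9T(n/3) + sqrt(n), with an assumed base cost T(1) = 1
    if n <= 1:
        return 1.0
    return 9 * T(n // 3) + math.sqrt(n)

# Evaluate at powers of 3 so n/3 is exact; the ratio stabilizes quickly.
for k in range(4, 9):
    n = 3 ** k
    print(n, round(T(n) / n ** 2, 6))
```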
(2) T(n) = 2T(n/2) + n log n

Case 2:

n log n = Θ(n^(log₂ 2) log^k n) for k = 1 ≥ 0

= Θ(n log n)

T(n) = Θ(n log^(k+1) n) for k = 1 ⇒ T(n) = Θ(n log² n)
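A numerical spot check of T(n) = 2T(n/2) + n log n (with an assumed base cost T(1) = 1): if T(n) = Θ(n log² n), the ratio T(n)/(n log² n) should level off as n grows.

```python
import math
from functools import lru_cache

@lru_cache(maxsize=None)
def T(n):
    # T(n) = 2T(n/2) + n*log2(n), with an assumed base cost T(1) = 1
    if n <= 1:
        return 1.0
    return 2 * T(n // 2) + n * math.log2(n)

# The ratio approaches a constant (here 1/2, from unrolling the sum).
for k in (10, 14, 18):
    n = 2 ** k
    print(n, round(T(n) / (n * math.log2(n) ** 2), 4))
```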

(3) T(n) = 36T(n/6) + n²

Case 2:

n² = Θ(n^(log₆ 36) log^k n) for k = 0 ≥ 0

= Θ(n²)

T(n) = Θ(n² log^(k+1) n) for k = 0 ⇒ T(n) = Θ(n² log n)
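For this recurrence a spot check is exact at powers of 6: unrolling T(n) = 36T(n/6) + n² with an assumed base cost T(1) = 1 gives T(6^k) = 36^k · (k + 1), which is Θ(n² log n) since n² = 36^k and k = log₆ n.

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def T(n):
    # T(n) = 36T(n/6) + n^2, with an assumed base cost T(1) = 1
    if n <= 1:
        return 1.0
    return 36 * T(n // 6) + n ** 2

# Unrolling predicts T(6^k) = 36^k * (k + 1) exactly.
for k in (2, 4, 8):
    print(k, T(6 ** k) == 36 ** k * (k + 1))
```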

(4) T(n) = T(2n/5) + n

Case 3:

n = Ω(n^(log_{5/2} 1 + ε)) for ε = 1

= Ω(n^(0 + 1))

= Ω(n)

n ≥ C · n for C = 1 and n₀ = 1

Regularity: 1 · (2/5)n ≤ C · n for C = 2/5 < 1

Both conditions are satisfied, so T(n) = Θ(n)
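Since the subproblem shrinks geometrically, the level costs form the series n(1 + 2/5 + (2/5)² + ...), so T(n)/n should approach 1/(1 − 2/5) = 5/3. A small check with an assumed base cost T(1) = 1 (integer division stands in for exact 2n/5):

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def T(n):
    # T(n) = T(2n/5) + n, with an assumed base cost T(1) = 1
    if n <= 1:
        return 1.0
    return T(2 * n // 5) + n

for n in (10 ** 3, 10 ** 5, 10 ** 7):
    print(n, round(T(n) / n, 4))  # approaches 5/3
```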

(5) T(n) = T(2n/3) + T(n/3) + n

The two subproblems created in the recurrence are of different sizes, so the master theorem does not apply here.

With a recursion tree, the largest height is log_{3/2} n and the cost at each level is at most n, so a reasonable bound is O(n log n).
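A numeric check supports this bound: evaluating T(n) = T(2n/3) + T(n/3) + n directly (floors for the uneven split, assumed base cost T(1) = 1) and dividing by n log₂ n gives a ratio that stays bounded as n grows:

```python
import math
from functools import lru_cache

@lru_cache(maxsize=None)
def T(n):
    # T(n) = T(2n/3) + T(n/3) + n, with an assumed base cost T(1) = 1
    if n <= 1:
        return 1.0
    return T(2 * n // 3) + T(n // 3) + n

# If T(n) = O(n log n), this ratio should stay bounded by a constant.
for k in (8, 12, 16, 20):
    n = 2 ** k
    print(n, round(T(n) / (n * math.log2(n)), 4))
```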

(6) T(n) = T(n − 2) + n

Here b is not > 1, so the master theorem does not apply. Using substitution and T(1) = d₀:

The recursion stops when n − 2x = 1, so T(n) recurses x = (n − 1)/2 times, with each step costing at most n. So roughly T(n) = d₀ + ((n − 1)/2) · n = O(n²).
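The bound can be checked by unrolling the recurrence iteratively (with an assumed base cost d₀ = 1): the sum n + (n − 2) + (n − 4) + ... has about n/2 terms averaging about n/2, so T(n)/n² approaches 1/4, safely inside O(n²).

```python
def T(n, d0=1):
    # Unroll T(n) = T(n - 2) + n iteratively (avoids deep recursion).
    total = 0
    while n > 2:
        total += n
        n -= 2
    return total + d0

for n in (101, 1001, 10001):
    print(n, round(T(n) / n ** 2, 4))  # ratio near 1/4
```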

2. Recursive Program (5 points)

Consider the following recursive function for n ≥ 0:

Algorithm 1 int recFunc(int n)

  // Base Case:
  if n <= 2 then
    return n;
  end if
  // Recursive Case:
  i = 0;
  while i < 100 do
    print("Hello!");
    i = i + 1;
  end while
  int a = 2 * recFunc(n / 2); // Integer division
  j = 0;
  while j < a do
    print("Bye!");
    j = j + 1;
  end while
  return a;
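The pseudocode above can be transcribed directly into Python for experimentation (a sketch; the `verbose` flag is an addition so the prints can be switched off):

```python
def recFunc(n, verbose=False):
    # Direct transcription of Algorithm 1; verbose toggles the prints.
    if n <= 2:  # base case
        return n
    for _ in range(100):
        if verbose:
            print("Hello!")
    a = 2 * recFunc(n // 2, verbose)  # integer division, as in the pseudocode
    for _ in range(a):
        if verbose:
            print("Bye!")
    return a

print([recFunc(n) for n in range(10)])  # → [0, 1, 2, 2, 4, 4, 4, 4, 8, 8]
```

The returned value appears to be the largest power of two not exceeding n (for n ≥ 1), which is consistent with the ≤ n claim in part (1).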

(1) Use strong induction to prove the value returned by recFunc(n) is ≤ n.

Base Cases (0 ≤ n ≤ 2):

For n = 0, 1, or 2 the function immediately returns n, so trivially recFunc(n) returns a value ≤ n.

Inductive step:

Assume: recFunc(k) returns a value ≤ k for all 0 ≤ k < n.

Prove: recFunc(n) returns a value ≤ n.

If n is odd then n = 2m + 1 for some positive integer m < n. So recFunc(n) returns 2 · recFunc((2m + 1)/2) = 2 · recFunc(m) because of integer division. Since m < n, under the inductive hypothesis recFunc(m) ≤ m, so recFunc(n) returns at most 2m < 2m + 1 = n, i.e. recFunc(n) ≤ n.

Similarly, if n is even then n = 2m for some positive integer m < n. So recFunc(n) returns 2 · recFunc(2m/2) = 2 · recFunc(m). Since m < n, under the inductive hypothesis recFunc(m) ≤ m, so recFunc(n) returns at most 2m = n.

This covers all cases of n ≥ 0, thus by strong induction recFunc(n) returns a value ≤ n.

(2) Set up a runtime recurrence for the runtime T(n) of this algorithm.

T(n) = 1 for 0 ≤ n ≤ 2

T(n) = 3T(n/2) + 100 for n > 2

(3) Solve this runtime recurrence using the master theorem.

Case 1:

100 = O(n^(log₂ 3 − ε)) for ε = 1

= O(n^(1.585 − 1))

= O(n^0.585)

100 ≤ C · n^0.585 for C = 100 and n₀ = 1, so T(n) = Θ(n^(log₂ 3))
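Numerically, with an assumed base cost of 1 for n ≤ 2, the ratio T(n)/n^(log₂ 3) does settle to a constant (about 17 at powers of two, since n^(log₂ 3) = 3^k when n = 2^k):

```python
import math
from functools import lru_cache

@lru_cache(maxsize=None)
def T(n):
    # T(n) = 3T(n/2) + 100, with an assumed base cost of 1 for n <= 2
    if n <= 2:
        return 1.0
    return 3 * T(n // 2) + 100

for k in (8, 12, 16):
    n = 2 ** k
    print(n, round(T(n) / n ** math.log2(3), 4))  # settles near 17
```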

3. Big-Oh Induction (4 points)

Use strong induction to prove that T(n) ∈ O(log n), where T(1) = 90 and T(n) = T(n/2) + 10 (for n ≥ 2).

Show T(n) ≤ C log n for n ≥ 2.

Base Case (n = 2):

LHS: T(2) = T(2/2) + 10 = T(1) + 10 = 90 + 10 = 100

RHS: C log 2 = C · 1

Choose C = 100 ⇒ T(2) = 100 ≤ 100 · 1

Inductive Step:

Assume: T(k) ≤ C log k for 2 ≤ k < n

Prove: T(n) ≤ C log n

T(n) = T(n/2) + 10 ≤ C log(n/2) + 10

= C[log n − log 2] + 10

= C log n − C + 10

≤ C log n, choosing C ≥ 10 so that the extra terms are ≤ 0

Choosing C = 100 (which satisfies both the base case and C ≥ 10) and n₀ = 2 completes the inductive step, so T(n) ∈ O(log n).
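The finished bound is easy to verify mechanically at powers of two: unrolling the recurrence gives T(2^k) = 90 + 10k, and 90 + 10k ≤ 100k for all k ≥ 1.

```python
import math

def T(n):
    # T(1) = 90, T(n) = T(n/2) + 10, evaluated at powers of two
    if n <= 1:
        return 90
    return T(n // 2) + 10

for k in range(1, 21):
    n = 2 ** k
    assert T(n) == 90 + 10 * k
    assert T(n) <= 100 * math.log2(n)
print("T(n) <= 100 log2(n) holds for n = 2, 4, ..., 2^20")
```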

4. Recursive Algorithm and Analysis (6 points)

Suppose company X has hired you as a software engineer.

Company X is desperate to improve the run time of their software. Rather than allow you to use the standard binary search algorithm, company X demands you create a new version. Since recursively dividing the array into two parts is so efficient for searching, the bigwigs at company X conclude dividing the array into three parts will be even more efficient.

This new "3-ary" search divides the given sorted array into 3 equal-sized sorted subarrays. It finds which subarray would contain the given value and then recursively searches that subarray. If the given value is in the array you should return true. Otherwise you should return false.

(1) (2 points) Write up this new recursive algorithm using pseudocode. (Hint: For simplicity you can assume the number of elements in the given sorted array, n, is a power of 3.)

Algorithm 2 bool threeSearch(int A[1 ... n], int val)

  // Assuming integer division
  if n == 1 then
    if A[1] == val then
      return true;
    else
      return false;
    end if
  else if n == 2 then
    if A[1] == val or A[2] == val then
      return true;
    else
      return false;
    end if
  else if val <= A[n/3] then
    return threeSearch(A[1 ... n/3], val);
  else if val <= A[2n/3] then
    return threeSearch(A[n/3 + 1 ... 2n/3], val);
  else
    return threeSearch(A[2n/3 + 1 ... n], val);
  end if
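A direct Python translation of the pseudocode (0-based indexing and slices instead of the 1-based A[i ... j] notation; the empty-array guard is an addition) can be tested against Python's own membership check:

```python
def threeSearch(A, val):
    # 3-ary search on a sorted list A; returns True iff val is in A.
    n = len(A)
    if n == 0:
        return False  # guard added for an empty input
    if n == 1:
        return A[0] == val
    if n == 2:
        return A[0] == val or A[1] == val
    third = n // 3
    if val <= A[third - 1]:           # val can only be in A[1 ... n/3]
        return threeSearch(A[:third], val)
    elif val <= A[2 * third - 1]:     # val can only be in A[n/3+1 ... 2n/3]
        return threeSearch(A[third:2 * third], val)
    else:                             # val can only be in A[2n/3+1 ... n]
        return threeSearch(A[2 * third:], val)

A = list(range(0, 54, 2))  # 27 sorted even numbers (27 = 3^3)
print(threeSearch(A, 40), threeSearch(A, 41))  # → True False
```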

(2) (1 point) Create a recurrence to represent the worst case run time of our new "3-ary" search from the pseudocode.

(Hint: The worst case run time occurs when the given value is not in the array.)

T(n) = T(n/3) + 1

(3) (2 points) Use the master theorem to solve your above recurrence relation.

Case 2:

f(n) = 1 = Θ(n^(log₃ 1) log^k n) for k = 0 ≥ 0

= Θ(n^0)

= Θ(1)

Clearly, 1 = Θ(1), so T(n) = Θ(n^0 log^(k+1) n) for k = 0, giving T(n) = Θ(log n)
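Unrolling T(n) = T(n/3) + 1 directly confirms the logarithmic depth: the count below grows by one each time n triples, i.e. it is log₃ n + O(1).

```python
def depth(n):
    # Count the levels of T(n) = T(n/3) + 1, i.e. the number of
    # subarray inspections 3-ary search makes on an array of size n.
    count = 1
    while n > 2:
        n //= 3
        count += 1
    return count

for k in (3, 6, 9, 12):
    print(3 ** k, depth(3 ** k))  # depth(3^k) = k + 1
```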

(4) (1 point) Compare this asymptotic run time to that of binary search. Does our new algorithm perform significantly better than binary search?

No, both algorithms perform asymptotically the same, differing only by constants. Binary search is Θ(log₂ n) while the "3-ary" search is Θ(log₃ n). The difference is only the constant factor from the base of the logarithm, which is insignificant in asymptotic analysis.

(5) (0 points – Just for fun) Try comparing the constants of binary search and "3-ary" search. You can use the change of base formula for log to convert both to log₂ n.
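A sketch of that comparison, assuming roughly one comparison per level for binary search and up to two per level for 3-ary search (each level tests against two boundary elements): by change of base, log₃ n = log₂ n / log₂ 3 ≈ 0.631 · log₂ n, so 3-ary search uses fewer levels but can do about 2/log₂ 3 ≈ 1.26 times as many comparisons overall.

```python
import math

# Change of base: log3(n) = log2(n) / log2(3)
level_ratio = 1 / math.log2(3)  # 3-ary levels per binary level
work_ratio = 2 * level_ratio    # assuming two comparisons per 3-ary level

print(round(level_ratio, 4))  # → 0.6309
print(round(work_ratio, 4))   # → 1.2619
```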