Q1) Gradient Descent in 1D: Consider the function $f(w) = \frac{1}{2} w^2$.

Perform gradient descent to find the minimum of $f$. For step size $\eta = 0.1$, plot the output of the algorithm at each step. [25 Marks]

Plot the output of the algorithm for $\eta = 0.1, 1, 1.5, 2, 2.5$. [15 Marks]

Implement gradient descent with line search. [10 Marks]
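As a starting point for the parts above, here is a minimal sketch of 1-D gradient descent on $f(w) = \frac{1}{2}w^2$ (so $f'(w) = w$), with both a fixed step size and a simple backtracking (Armijo) line search. The starting point `w0 = 5.0`, the step counts, and the backtracking constants are illustrative choices, not part of the assignment statement.

```python
# Sketch of 1-D gradient descent for f(w) = 0.5 * w**2, whose derivative is
# f'(w) = w. Starting point and iteration counts are illustrative.
def grad_descent_1d(grad, w0, eta, steps=100):
    w, history = w0, [w0]
    for _ in range(steps):
        w = w - eta * grad(w)          # update rule: w_{t+1} = w_t - eta * f'(w_t)
        history.append(w)
    return history

def grad_descent_backtracking(f, grad, w0, steps=20, eta0=1.0, beta=0.5, c=1e-4):
    # Line-search variant: shrink eta until the Armijo sufficient-decrease
    # condition f(w - eta*g) <= f(w) - c * eta * g**2 holds.
    w, history = w0, [w0]
    for _ in range(steps):
        g = grad(w)
        eta = eta0
        while f(w - eta * g) > f(w) - c * eta * g * g:
            eta *= beta
        w = w - eta * g
        history.append(w)
    return history

iterates = grad_descent_1d(lambda w: w, w0=5.0, eta=0.1)
ls_iterates = grad_descent_backtracking(lambda w: 0.5 * w * w, lambda w: w, w0=5.0)
```

For this particular $f$, the fixed-step update multiplies $w_t$ by $(1 - \eta)$ each step, which is why $\eta = 2$ makes the iterates oscillate between $\pm w_0$ and $\eta = 2.5$ makes them diverge; plotting `history` against the step index shows this directly (e.g. with `matplotlib.pyplot.plot`).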

Q2) Repeat the previous question for
a) $f(w) = \frac{1}{2} w^2 - 5w + 3$. [20 Marks]
b) $f(w) = \frac{1}{1 + e^{-w}}$. [10 Marks]
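The same routine carries over to both functions; only the derivative changes. The sketch below assumes the reconstructed definitions $f(w) = \frac{1}{2}w^2 - 5w + 3$ and the sigmoid $f(w) = 1/(1 + e^{-w})$ (the extracted statement appears to have dropped the minus signs), and the starting point and step settings are again illustrative.

```python
import math

def gd(grad, w0=0.0, eta=0.1, steps=200):
    w = w0
    for _ in range(steps):
        w -= eta * grad(w)
    return w

# a) f(w) = 0.5*w**2 - 5*w + 3  =>  f'(w) = w - 5, unique minimizer w* = 5.
w_a = gd(lambda w: w - 5.0)

# b) f(w) = 1/(1 + exp(-w))  =>  f'(w) = f(w) * (1 - f(w)) > 0 for every w,
#    so the iterates decrease without bound: the sigmoid has no finite minimizer.
sigmoid = lambda w: 1.0 / (1.0 + math.exp(-w))
w_b = gd(lambda w: sigmoid(w) * (1.0 - sigmoid(w)))
```

Part b) is the interesting case: because the sigmoid's derivative is strictly positive, gradient descent keeps moving left forever, and the plot of the iterates shows a slow drift toward $-\infty$ rather than convergence.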
Q3) Gradient Descent in 2D: Let $w \in \mathbb{R}^2$. Consider the functions $f_1(w) = w(1)^2 + w(2)^2 + 5w(1) - 3w(2) - 2$ and $f_2(w) = 10w(1)^2 + w(2)^2$.

Show the gradient and contour plots for $f_1$ and $f_2$. [10 Marks]

Perform gradient descent to find the minimum of $f_1$ and $f_2$. [10 Marks]
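A minimal 2-D sketch follows. The gradients below assume the reconstructed definitions $f_1(w) = w(1)^2 + w(2)^2 + 5w(1) - 3w(2) - 2$ and $f_2(w) = 10w(1)^2 + w(2)^2$ (the minus signs are an assumption about the garbled source); step size, starting points, and step count are example choices.

```python
import numpy as np

def gd2(grad, w0, eta=0.05, steps=500):
    w = np.asarray(w0, dtype=float)
    for _ in range(steps):
        w = w - eta * grad(w)      # w_{t+1} = w_t - eta * grad L(w_t)
    return w

# grad f1 = (2*w1 + 5, 2*w2 - 3);  grad f2 = (20*w1, 2*w2)
grad_f1 = lambda w: np.array([2.0 * w[0] + 5.0, 2.0 * w[1] - 3.0])
grad_f2 = lambda w: np.array([20.0 * w[0], 2.0 * w[1]])

w1_star = gd2(grad_f1, [0.0, 0.0])   # analytic minimizer: (-2.5, 1.5)
w2_star = gd2(grad_f2, [1.0, 1.0])   # analytic minimizer: (0, 0)
```

Note that $f_2$ is ill-conditioned (curvature 20 along $w(1)$ versus 2 along $w(2)$), so a single fixed step size must satisfy $\eta < 2/20 = 0.1$ to avoid divergence along the steep axis; this is exactly what the contour plots (e.g. `matplotlib.pyplot.contour` over a `numpy.meshgrid`) make visible.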
The gradient descent procedure in one dimension is given by

$$w_{t+1} = w_t - \eta \left. \frac{dL}{dw} \right|_{w = w_t} \qquad (1)$$
The gradient in $d$ dimensions is denoted by $\nabla L$; it is a function from $\mathbb{R}^d \to \mathbb{R}^d$, i.e., at any input point in $\mathbb{R}^d$, the gradient outputs the direction of maximum change (the direction is a vector in $\mathbb{R}^d$). Thus at input $w_0 \in \mathbb{R}^d$, the gradient outputs

$$\nabla L(w_0) = \left( \left. \frac{\partial L}{\partial w(1)} \right|_{w(1)=w_0(1)}, \left. \frac{\partial L}{\partial w(2)} \right|_{w(2)=w_0(2)}, \ldots, \left. \frac{\partial L}{\partial w(d)} \right|_{w(d)=w_0(d)} \right).$$

The gradient descent procedure in $d$ dimensions is given by
$$w_{t+1} = w_t - \eta\, \nabla L(w_t), \qquad (2)$$

which is the same as

$$w_{t+1}(i) = w_t(i) - \eta \left. \frac{\partial L}{\partial w(i)} \right|_{w(i) = w_t(i)}. \qquad (3)$$
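The component-wise update (3) can be sketched generically by estimating each partial derivative with a central finite difference, which works for any scalar loss $L$ without a hand-derived gradient. The loss, starting point, step size, and step count below are example choices, not part of the assignment.

```python
import numpy as np

# Generic sketch of update (2): each partial derivative in (3) is estimated
# with a central difference (L(w+h*e_i) - L(w-h*e_i)) / (2h).
def numeric_grad(L, w, h=1e-6):
    g = np.zeros_like(w)
    for i in range(w.size):
        e = np.zeros_like(w)
        e[i] = h
        g[i] = (L(w + e) - L(w - e)) / (2.0 * h)   # ~ dL/dw(i) at w
    return g

def gradient_descent(L, w0, eta=0.1, steps=200):
    w = np.asarray(w0, dtype=float)
    for _ in range(steps):
        w = w - eta * numeric_grad(L, w)           # w_{t+1} = w_t - eta * grad L(w_t)
    return w

w_min = gradient_descent(lambda w: np.sum(w ** 2), [3.0, -4.0])  # minimizer: origin
```

For the assignment's functions an analytic gradient is easy to write down and preferable (finite differences cost $2d$ loss evaluations per step), but the numeric version is a useful cross-check of a hand-derived $\nabla L$.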