Note: The assignment will be autograded. It is important that you do not use additional libraries or change the provided functions' inputs and outputs.
Part 1: Setup
Remotely connect to an EWS machine:
ssh (netid)@remlnx.ews.illinois.edu
Load the Python module; this will also load pip and virtualenv:
module load python/3.4.3
Activate the course virtual environment:
source ~/cs446sp_2018/bin/activate
Copy the assignment into your netid directory and enter it:
cd ~/(netid)
svn cp https://subversion.ews.illinois.edu/svn/sp18-cs446/_shared/mp6 .
cd mp6
Install the required packages:
pip install -r requirements.txt
Part 2: Exercise
In this exercise, you will implement the deep net from the written section and train it using stochastic gradient descent. The deep net should be implemented from scratch in Python (i.e., the forward pass and back-propagation). Do not use TensorFlow, PyTorch, Chainer, etc. to implement the network. Consider the number of input layers to be 2, the number of hidden layers to be 3, the number of output layers to be 1, and the batch size to be 3. The loss to be optimized is (1/m) * sum_m E_m, where m is the batch size and E_m is the L2 loss from the written part associated with the m-th sample.
Hint: Think how the derivatives computed in the written part will change when considering this new loss.
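To make the batch-averaged loss and its derivatives concrete, here is a minimal sketch of a 2-3-1 network trained from scratch with NumPy. This is NOT the graded solution: it assumes a sigmoid hidden layer, a linear output, and the loss (1/m) * sum_m E_m with E_m a scaled squared error; the activations and initialization in back_prop.py may differ, so adapt the derivatives to match your written part.

```python
import numpy as np

def sigmoid(z):
    # Elementwise logistic activation.
    return 1.0 / (1.0 + np.exp(-z))

def forward(x, W1, b1, W2, b2):
    """x: (m, 2) batch. Returns hidden activations and network outputs."""
    h = sigmoid(x @ W1 + b1)      # (m, 3) hidden layer
    y_hat = h @ W2 + b2           # (m, 1) linear output
    return h, y_hat

def backward(x, y, h, y_hat, W2):
    """Gradients of (1/m) * sum_m 0.5 * (y_hat_m - y_m)^2 w.r.t. parameters.

    Note the 1/m factor: averaging the per-sample losses scales every
    per-sample derivative from the written part by 1/m.
    """
    m = x.shape[0]
    delta2 = (y_hat - y) / m                   # (m, 1) output-layer error
    dW2 = h.T @ delta2                         # (3, 1)
    db2 = delta2.sum(axis=0)                   # (1,)
    delta1 = (delta2 @ W2.T) * h * (1.0 - h)   # (m, 3) via sigmoid'
    dW1 = x.T @ delta1                         # (2, 3)
    db1 = delta1.sum(axis=0)                   # (3,)
    return dW1, db1, dW2, db2

rng = np.random.RandomState(0)
W1, b1 = rng.randn(2, 3) * 0.5, np.zeros(3)
W2, b2 = rng.randn(3, 1) * 0.5, np.zeros(1)

# Toy data: batch size 3, inputs in R^2, scalar targets (made up here).
x = np.array([[0.0, 1.0], [1.0, 0.0], [1.0, 1.0]])
y = np.array([[1.0], [1.0], [0.0]])

lr = 0.5
for _ in range(2000):  # gradient steps on the batch-averaged loss
    h, y_hat = forward(x, W1, b1, W2, b2)
    dW1, db1, dW2, db2 = backward(x, y, h, y_hat, W2)
    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2

_, y_hat = forward(x, W1, b1, W2, b2)
loss = 0.5 * np.mean((y_hat - y) ** 2)
print(loss)  # small after training
```

The only change the batch introduces relative to the single-sample derivatives is the 1/m averaging factor, which appears once in delta2 and then propagates through every gradient.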
Description of the functions to be implemented:
Relevant file: back_prop.py
Part 3: Writing Tests
In test.py, we have provided basic test cases. Feel free to write more. To test the code, run:
nose2
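nose2 discovers standard unittest-style test classes, so extra tests can be added to test.py as ordinary test methods. Since the exact function signatures in back_prop.py are not shown here, the sketch below uses a self-contained toy loss and checks its analytic gradient against a finite-difference estimate; the same pattern can be adapted to your own forward/backward functions.

```python
import unittest
import numpy as np

def toy_loss(w):
    # Stand-in loss (hypothetical, not from back_prop.py): 0.5 * ||w||^2.
    return 0.5 * np.sum(w ** 2)

def toy_grad(w):
    # Analytic gradient of toy_loss.
    return w

class TestGradients(unittest.TestCase):
    def test_finite_difference(self):
        # Central finite differences should match the analytic gradient.
        rng = np.random.RandomState(1)
        w = rng.randn(5)
        eps = 1e-6
        num_grad = np.zeros_like(w)
        for i in range(w.size):
            w_plus, w_minus = w.copy(), w.copy()
            w_plus[i] += eps
            w_minus[i] -= eps
            num_grad[i] = (toy_loss(w_plus) - toy_loss(w_minus)) / (2 * eps)
        np.testing.assert_allclose(toy_grad(w), num_grad, rtol=1e-4)

if __name__ == "__main__":
    unittest.main()
```

Gradient checking like this is a cheap way to catch sign errors and missing 1/m factors in back-propagation code before the autograder does.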
Part 4: Submit
Submitting the code is equivalent to committing the code. This can be done with the following command:
svn commit -m "Some meaningful comment here."
Lastly, double check on your browser that you can see your code at
https://subversion.ews.illinois.edu/svn/sp18-cs446/(netid)/mp6/