Analyzing Network Output - Part 1, Training and Saving

Posted on Thu 12 April 2018 in Tutorial by Corey Adams
Tagged with MNIST, training, saving, minibatching, batch norm

I train a simple MNIST classification network with some deliberate overkill: I use minibatching and batch normalization, and I save the trained network weights to disk. The whole tutorial runs comfortably on a CPU. In Part 2, I restore the model, run it on the validation set, and analyze the results.
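The three ingredients above (minibatching, batch normalization, and saving weights) can be sketched in a few lines. The post does not say which framework it uses, so this is a minimal NumPy illustration of the ideas, not the tutorial's actual code; the data here is random noise standing in for MNIST, and the helper names are my own.

```python
import numpy as np

def batch_norm(x, gamma, beta, eps=1e-5):
    # Normalize each feature over the minibatch, then scale and shift.
    mean = x.mean(axis=0)
    var = x.var(axis=0)
    x_hat = (x - mean) / np.sqrt(var + eps)
    return gamma * x_hat + beta

def minibatches(X, y, batch_size, rng):
    # Shuffle once per epoch, then yield fixed-size minibatches.
    idx = rng.permutation(len(X))
    for start in range(0, len(X), batch_size):
        sl = idx[start:start + batch_size]
        yield X[sl], y[sl]

# Toy stand-in for MNIST: 784 features (28x28 pixels), 10 classes.
rng = np.random.default_rng(0)
X = rng.normal(size=(256, 784)).astype(np.float32)
y = rng.integers(0, 10, size=256)

# A single linear layer's weights, plus batch-norm scale/shift parameters.
W = rng.normal(scale=0.01, size=(784, 10)).astype(np.float32)
gamma = np.ones(784, dtype=np.float32)
beta = np.zeros(784, dtype=np.float32)

for xb, yb in minibatches(X, y, batch_size=64, rng=rng):
    logits = batch_norm(xb, gamma, beta) @ W
    # ...compute the loss and update W here...

# Save the learned parameters to disk for restoring in Part 2.
np.savez("weights.npz", W=W, gamma=gamma, beta=beta)
```

A real framework (TensorFlow, PyTorch) handles the batch-norm running statistics and checkpointing for you; the point of the sketch is just what each of the three pieces does.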
