Analyzing Network Output - Part 1, Training and Saving

Posted on Thu. 12 April 2018 in Tutorial by Corey Adams
Tagged with MNIST, training, saving, minibatching, batch norm

I train a very simple MNIST classification network, with some deliberate overkill: I use minibatching and batch normalization, and I save the network weights to disk. This tutorial can be done on a CPU. In Part 2, I restore the model, run it on the validation set, and analyze the results.
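The full walkthrough is in the post itself, but the pieces fit together roughly as follows. This teaser doesn't name a framework or architecture, so the sketch below is a minimal PyTorch version under those assumptions; the model, hyperparameters, and the SAVE_PATH name are placeholders, not taken from the tutorial.

```python
# Hypothetical sketch of the Part 1 workflow: minibatching,
# batch normalization, and saving weights to disk.
import torch
import torch.nn as nn
from torch.utils.data import DataLoader
from torchvision import datasets, transforms

SAVE_PATH = "mnist_net.pt"  # placeholder checkpoint location

# Minibatching via a DataLoader
train_data = datasets.MNIST("data", train=True, download=True,
                            transform=transforms.ToTensor())
loader = DataLoader(train_data, batch_size=64, shuffle=True)

# A small placeholder classifier with a batch-norm layer
model = nn.Sequential(
    nn.Flatten(),
    nn.Linear(28 * 28, 128),
    nn.BatchNorm1d(128),
    nn.ReLU(),
    nn.Linear(128, 10),
)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

model.train()  # batch norm uses per-batch statistics in train mode
for epoch in range(2):
    for images, labels in loader:
        optimizer.zero_grad()
        loss = loss_fn(model(images), labels)
        loss.backward()
        optimizer.step()

# Save the trained weights to disk for Part 2
torch.save(model.state_dict(), SAVE_PATH)
```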


Continue reading

Analyzing Network Output - Part 2, Restoring and Analyzing

Posted on jeu. 12 avril 2018 in Tutorial by Corey Adams
Tagged with MNIST, restoring, analysis, minibatching, batch norm

See Part 1 first! There, I trained an MNIST classifier. Here, I restore the trained network, run it on the validation set, and do some analysis on the output. This is meant as a template for new users asking, "how do I actually use a trained network?" Lots of this information exists elsewhere too; I've just tried to consolidate the basics.
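Again, the details live in the post; as a rough guide, restoring and evaluating looks something like the PyTorch sketch below, continuing the hypothetical Part 1 snippet above. The model definition must match the one whose weights were saved, and SAVE_PATH is the same placeholder name.

```python
# Hypothetical sketch of the Part 2 workflow: restore saved
# weights and measure accuracy on the validation set.
import torch
import torch.nn as nn
from torch.utils.data import DataLoader
from torchvision import datasets, transforms

SAVE_PATH = "mnist_net.pt"  # same placeholder path as in Part 1

# Must match the architecture used at training time
model = nn.Sequential(
    nn.Flatten(),
    nn.Linear(28 * 28, 128),
    nn.BatchNorm1d(128),
    nn.ReLU(),
    nn.Linear(128, 10),
)
model.load_state_dict(torch.load(SAVE_PATH))
model.eval()  # batch norm switches to its running statistics

val_data = datasets.MNIST("data", train=False, download=True,
                          transform=transforms.ToTensor())
loader = DataLoader(val_data, batch_size=256)

correct = total = 0
with torch.no_grad():
    for images, labels in loader:
        preds = model(images).argmax(dim=1)
        correct += (preds == labels).sum().item()
        total += labels.size(0)

print(f"validation accuracy: {correct / total:.4f}")
```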


Continue reading