Analyzing Network Output - Part 1, Training and Saving

Posted on Thu. 12 April 2018 in Tutorial by Corey Adams
Tagged with MNIST, training, saving, minibatching, batch norm

I train a simple MNIST classification network with deliberate overkill: minibatching, batch normalization, and saving the network weights to disk. This tutorial can be run on a CPU. In Part 2, I restore the model, run it on the validation set, and analyze the results.
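For a taste of the save step before clicking through, here is a minimal TensorFlow 1.x sketch; the toy network, optimizer, and checkpoint path are placeholder assumptions for illustration, not the post's actual code:

    import tensorflow as tf

    # Toy classifier standing in for the post's network (assumed shapes: 784-dim MNIST vectors).
    x = tf.placeholder(tf.float32, [None, 784], name="x")
    y = tf.placeholder(tf.int64, [None], name="y")
    logits = tf.layers.dense(x, 10, name="logits")
    loss = tf.reduce_mean(
        tf.nn.sparse_softmax_cross_entropy_with_logits(labels=y, logits=logits))
    train_op = tf.train.AdamOptimizer(1e-3).minimize(loss)

    saver = tf.train.Saver()  # serializes all variables in the graph
    with tf.Session() as sess:
        sess.run(tf.global_variables_initializer())
        # ... minibatch training steps would run here ...
        saver.save(sess, "./mnist_model", global_step=1000)  # hypothetical path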


Continue reading

Analyzing Network Output - Part 2, Restoring and Analyzing

Posted on Thu. 12 April 2018 in Tutorial by Corey Adams
Tagged with MNIST, restoring, analysis, minibatching, batch norm

See Part 1 first! There, I trained an MNIST classifier. Here, I restore the trained network, run it on the validation set, and do some analysis on the output. This is meant as a template for new users asking "how do I actually use a trained network?" Much of this information exists elsewhere too; I've just tried to consolidate the basics.
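A hedged sketch of the restore-and-evaluate flow in TensorFlow 1.x; the checkpoint path and tensor names carry over from the assumed training sketch under Part 1 above and are not the post's actual code:

    import numpy as np
    import tensorflow as tf

    with tf.Session() as sess:
        # Rebuild the graph structure from the .meta file, then load the weights.
        saver = tf.train.import_meta_graph("./mnist_model-1000.meta")
        saver.restore(sess, tf.train.latest_checkpoint("."))

        graph = tf.get_default_graph()
        x = graph.get_tensor_by_name("x:0")
        # Assumed op name from the dense layer scoped "logits" above.
        logits = graph.get_tensor_by_name("logits/BiasAdd:0")

        # Stand-in for real validation images (assumed 784-dim vectors).
        val_images = np.random.rand(32, 784).astype(np.float32)
        predictions = sess.run(logits, feed_dict={x: val_images}).argmax(axis=1)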


Continue reading

How to Implement Minibatching in TensorFlow

Posted on Fri. 22 December 2017 in Minibatch by Ji Won Park
Tagged with MNIST, minibatch, tensorflow

A demonstration of minibatch implementation in TensorFlow, comparing training with and without minibatches, using the MNIST dataset as an example.
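The core idea, as a short numpy sketch (illustrative only; the function and variable names are mine, not the post's): shuffle the training indices each epoch and feed fixed-size slices to the training op.

    import numpy as np

    def minibatches(images, labels, batch_size=128):
        """Yield shuffled, fixed-size slices of the training data."""
        idx = np.random.permutation(len(images))
        for start in range(0, len(images), batch_size):
            batch = idx[start:start + batch_size]
            yield images[batch], labels[batch]

    # Typical TF 1.x usage (names assumed): one sess.run per batch.
    # for xb, yb in minibatches(train_images, train_labels):
    #     sess.run(train_op, feed_dict={x: xb, y: yb})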


Continue reading

Generative Adversarial Networks: The Basics

Posted on Tue. 19 December 2017 in Generative Adversarial Network by Corey Adams
Tagged with MNIST, GAN, generative, adversarial

An introduction to generative adversarial networks, covering the bare basics of how to build and train a GAN on MNIST data.
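To preview those bare basics, here is a minimal sketch of the two adversarial losses in TensorFlow 1.x; the helper name and logit arguments are assumptions, not the post's code. The discriminator is trained to score real images as 1 and generated images as 0, while the generator is trained to make the discriminator score its fakes as 1.

    import tensorflow as tf

    def gan_losses(d_logits_real, d_logits_fake):
        # Discriminator: push real logits toward 1 and fake logits toward 0.
        d_loss_real = tf.reduce_mean(tf.nn.sigmoid_cross_entropy_with_logits(
            labels=tf.ones_like(d_logits_real), logits=d_logits_real))
        d_loss_fake = tf.reduce_mean(tf.nn.sigmoid_cross_entropy_with_logits(
            labels=tf.zeros_like(d_logits_fake), logits=d_logits_fake))
        # Generator: make the discriminator call the fakes real.
        g_loss = tf.reduce_mean(tf.nn.sigmoid_cross_entropy_with_logits(
            labels=tf.ones_like(d_logits_fake), logits=d_logits_fake))
        return d_loss_real + d_loss_fake, g_loss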


Continue reading