Profiling Tensorflow

Posted on Tue. 25 September 2018 in Tutorial by Corey Adams
Tagged with Tensorflow, profiling, memory, comparisons

A walkthrough of many of the current techniques for profiling time and memory usage in Tensorflow.
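As a taste of one technique of this kind, below is a minimal sketch of trace-based timing with the Chrome timeline, assuming TensorFlow 1.x; the toy matmul graph stands in for a real model and is not taken from the post.

import tensorflow as tf
from tensorflow.python.client import timeline

# Toy graph standing in for a real model.
a = tf.random_normal([1000, 1000])
b = tf.random_normal([1000, 1000])
c = tf.matmul(a, b)

# Request a full trace of op-level timing for this run.
run_options  = tf.RunOptions(trace_level=tf.RunOptions.FULL_TRACE)
run_metadata = tf.RunMetadata()

with tf.Session() as sess:
    sess.run(c, options=run_options, run_metadata=run_metadata)
    # Dump a Chrome trace, viewable at chrome://tracing.
    tl = timeline.Timeline(run_metadata.step_stats)
    with open('timeline.json', 'w') as f:
        f.write(tl.generate_chrome_trace_format())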


Continue reading

Analyzing Network Output - Part 1, Training and Saving

Posted on Thu. 12 April 2018 in Tutorial by Corey Adams
Tagged with MNIST, training, saving, minibatching, batch norm

I train a very simple MNIST classification network with a healthy dose of overkill: I use minibatching and batch normalization, and I save the network weights to disk. This tutorial can be done on a CPU. In Part 2, I restore the model, run on the validation set, and analyze the results.
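For readers who just want the shape of the saving step, here is a minimal sketch using tf.train.Saver, assuming TensorFlow 1.x; the single dense layer and checkpoint path are illustrative placeholders, not the network built in the post.

import tensorflow as tf

# Illustrative stand-in for the classifier built in the post.
x = tf.placeholder(tf.float32, [None, 784], name='x')
logits = tf.layers.dense(x, 10, name='logits')

saver = tf.train.Saver()
with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    # ... the minibatched training loop would run here ...
    saver.save(sess, './checkpoints/mnist_model', global_step=1000)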


Continue reading

Analyzing Network Output - Part 2, Restoring and Analyzing

Posted on Thu. 12 April 2018 in Tutorial by Corey Adams
Tagged with MNIST, restoring, analysis, minibatching, batch norm

See Part 1 first! There, I trained an MNIST classifier. Here, I restore the trained network, run it on the validation set, and do some analysis on the output. This is meant as a template for new users asking "how do I actually use a trained network?" Much of this information exists elsewhere too; I've just tried to consolidate the basics.
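The restoring side looks roughly like the sketch below, again assuming TensorFlow 1.x; the graph definition, checkpoint directory, and feed names are illustrative only.

import tensorflow as tf

# Rebuild the same graph that was saved in Part 1.
x = tf.placeholder(tf.float32, [None, 784], name='x')
logits = tf.layers.dense(x, 10, name='logits')
predictions = tf.argmax(logits, axis=1)

saver = tf.train.Saver()
with tf.Session() as sess:
    # Load the most recent checkpoint written during training.
    ckpt = tf.train.latest_checkpoint('./checkpoints')
    saver.restore(sess, ckpt)
    # preds = sess.run(predictions, feed_dict={x: validation_images})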


Continue reading

Generative Adversarial Networks: The Basics

Posted on Tue. 19 December 2017 in Generative Adversarial Network by Corey Adams
Tagged with MNIST, GAN, generative, adversarial

An introduction to generative adversarial networks, covering the bare basics of how to build and train a GAN on MNIST data.
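To give a flavor of the setup, here is a bare-bones sketch of the two competing losses, assuming TensorFlow 1.x; the tiny fully connected generator and discriminator are placeholders for whatever architecture the post actually builds.

import tensorflow as tf

def generator(z):
    # Maps noise to a flattened 28x28 image in [0, 1].
    with tf.variable_scope('gen'):
        h = tf.layers.dense(z, 128, activation=tf.nn.relu, name='h')
        return tf.layers.dense(h, 784, activation=tf.nn.sigmoid, name='img')

def discriminator(x, reuse=False):
    # Returns a single real/fake logit per image.
    with tf.variable_scope('disc', reuse=reuse):
        h = tf.layers.dense(x, 128, activation=tf.nn.relu, name='h')
        return tf.layers.dense(h, 1, name='logit')

real = tf.placeholder(tf.float32, [None, 784])
z    = tf.placeholder(tf.float32, [None, 100])

d_real = discriminator(real)
d_fake = discriminator(generator(z), reuse=True)

xent = tf.nn.sigmoid_cross_entropy_with_logits
d_loss = tf.reduce_mean(xent(logits=d_real, labels=tf.ones_like(d_real)) +
                        xent(logits=d_fake, labels=tf.zeros_like(d_fake)))
g_loss = tf.reduce_mean(xent(logits=d_fake, labels=tf.ones_like(d_fake)))

# Each optimizer only updates its own network's variables.
d_vars = tf.get_collection(tf.GraphKeys.TRAINABLE_VARIABLES, 'disc')
g_vars = tf.get_collection(tf.GraphKeys.TRAINABLE_VARIABLES, 'gen')
d_train = tf.train.AdamOptimizer(2e-4).minimize(d_loss, var_list=d_vars)
g_train = tf.train.AdamOptimizer(2e-4).minimize(g_loss, var_list=g_vars)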


Continue reading