How to Implement Minibatching in TensorFlow

Posted on Fri 22 December 2017 in misc • Tagged with MNIST, minibatch

In this notebook, I show how to implement "minibatches", a method that accumulates gradients over several small chunks of data ("mini" batches) and applies them all at once. To keep this exercise concise, I use the MNIST dataset and a very simple neural network as an example. I compare how the results differ when the network is trained with and without minibatches, for several popular optimizers.
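The core idea can be sketched in a few lines of plain Python, independent of TensorFlow: sum the gradients computed on each small chunk, then apply one combined update. This is a minimal illustration with a toy linear model; the names (`w`, `lr`, `chunks`) and the squared-error loss are illustrative assumptions, not taken from the post itself.

```python
# Minimal sketch of gradient accumulation ("minibatching" as described above):
# gradients are summed over several small chunks of data and applied in a
# single update, rather than after every sample.

def grad(w, x, y):
    """Gradient of the squared error (w*x - y)**2 with respect to w."""
    return 2.0 * (w * x - y) * x

def train_step_accumulated(w, chunks, lr=0.1):
    """Accumulate gradients over all chunks, then apply them together."""
    total_grad, n = 0.0, 0
    for chunk in chunks:
        for x, y in chunk:
            total_grad += grad(w, x, y)
            n += 1
    return w - lr * total_grad / n  # one update for the whole minibatch

# Toy data split into two chunks; the target relationship is y = 3*x.
chunks = [[(1.0, 3.0), (2.0, 6.0)], [(3.0, 9.0), (4.0, 12.0)]]
w = 0.0
for _ in range(200):
    w = train_step_accumulated(w, chunks)
print(round(w, 3))  # converges toward 3.0
```

In a real TensorFlow training loop, the same pattern applies: compute per-chunk gradients, add them into accumulator variables, and call the optimizer's update once per minibatch.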

Continue reading

Generative Adversarial Networks: The Basics

Posted on Tue 19 December 2017 in misc • Tagged with MNIST, GAN, generative, adversarial

Generative Adversarial Network Tutorial 00

In particle physics, we think adversarial networks are particularly interesting tools for exploring discrepancies between our simulation and real detector data. In principle, the physics tools used to simulate particle interaction data are quite good, but they are never perfect, and they are imperfect in ways that are difficult to model. Many experiments spend a lot of time studying the differences between the output of their simulation and their real detector data, so a deep network that can learn these differences is very useful for making progress.

Continue reading