Normalizing Flows: the GAN's Comely Cousin

Chris Finlay

12:00, Friday, Nov. 29
BURN 1025



Probabilistic generative models arguably have the greatest whiz-bang factor of all the new-fangled deep learning fields. Who isn't besotted with a computer that can endlessly generate pictures of unworldly kittens?

In this pizza talk, I will discuss one of the less well-known generative methods, Normalizing Flows. I will show how this method arises quite easily from "one neat trick" (change of variables) while maximizing the likelihood of the data.
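For the curious, here is a rough sketch of the trick, in notation chosen for this announcement rather than taken from the talk: if an invertible map $f$ sends data $x$ to a latent variable $z = f(x)$ with a simple reference density $p_Z$ (say a standard Gaussian), the change of variables formula gives

$$\log p_X(x) \;=\; \log p_Z(f(x)) + \log\left|\det \nabla f(x)\right|,$$

and a normalizing flow is a parametrized invertible $f$ trained by maximizing this log-likelihood over the data.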

Then I will discuss how Normalizing Flows, when implemented with the world's favourite neural network, the ResNet, can be viewed as learning the dynamics of a transport map governed by an ordinary differential equation.
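Very roughly, and again in notation chosen here: a residual block $x_{k+1} = x_k + h\,v(x_k, t_k)$ looks like one forward Euler step of the ordinary differential equation

$$\dot{x}(t) = v(x(t), t),$$

so a deep ResNet can be read as a discretized transport map carrying the data distribution toward the reference distribution along the flow of this ODE.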

All graduate students are invited. As with all talks in the graduate student seminar, this talk will be accessible to all graduate students in math and stats.

This seminar was made possible by funding from the McGill mathematics and statistics department and PGSS.
