Aidan is a doctoral student of Yarin Gal and Yee Whye Teh at the University of Oxford. He leads a research group, FOR.ai, which focuses on providing resources and mentorship, and on facilitating collaboration between academia and industry. Aidan's research centres on understanding and improving neural networks and their applications. Previously, he worked with Geoffrey Hinton and Łukasz Kaiser on the Google Brain team.
Targeted Dropout and Bitrot: Simple and Effective Techniques for Sparsification and Quantization
Targeted Dropout (TD) is a simple, easily implemented technique that can drastically improve the post hoc sparsification of neural networks. This talk will present TD and discuss recent results in sparsification and quantization, which suggest that the over-parameterization of neural networks presents a great opportunity for both faster training and faster inference.
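To give a flavour of the idea, here is a minimal sketch of magnitude-targeted dropout in NumPy. It assumes the common formulation in which the lowest-magnitude fraction of weights is marked as the targeted set and dropout is applied only within that set, so the network learns to be robust to pruning exactly those weights; the function name and parameters (`targ_frac`, `drop_prob`) are illustrative, not from the talk itself.

```python
import numpy as np

def targeted_dropout(weights, targ_frac=0.5, drop_prob=0.5, rng=None):
    """Illustrative sketch: apply dropout only to the lowest-magnitude weights.

    Rank weights by |w|, mark the bottom `targ_frac` fraction as the
    targeted set, then zero each targeted weight independently with
    probability `drop_prob` (as would be done during training).
    """
    rng = np.random.default_rng() if rng is None else rng
    flat = weights.ravel()
    k = int(targ_frac * flat.size)  # number of targeted (smallest) weights
    if k == 0:
        return weights.copy()
    # Indices of the k smallest-magnitude weights.
    targeted = np.argpartition(np.abs(flat), k - 1)[:k]
    mask = np.ones_like(flat)
    drop = rng.random(k) < drop_prob  # independent dropout within the targeted set
    mask[targeted[drop]] = 0.0
    return (flat * mask).reshape(weights.shape)

# With drop_prob=1.0 this reduces to magnitude pruning of the targeted set:
w = np.array([[0.01, -2.0],
              [0.5, -0.02]])
w_td = targeted_dropout(w, targ_frac=0.5, drop_prob=1.0)
# The two smallest-magnitude weights (0.01 and -0.02) are zeroed;
# the large weights (-2.0 and 0.5) are untouched.
```

Because gradients only ever see the small weights being dropped, post hoc pruning of that same fraction after training costs little accuracy, which is the effect the talk describes.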