
BIDSA Seminar: "Probabilistic symmetry and invariant neural networks"

SPEAKER: Yee Whye Teh, University of Oxford

"Probabilistic symmetry and invariant neural networks"


17 April 2019, 12:30 PM

Bocconi University, Room 3.E4.SR03

Via Roentgen 1, 3rd floor


ABSTRACT

In an effort to improve the performance of deep neural networks in data-scarce, non-i.i.d., or unsupervised settings, much recent research has been devoted to encoding invariance under symmetry transformations into neural network architectures. We treat the neural network input and output as random variables, and consider group invariance from the perspective of probabilistic symmetry. Drawing on tools from probability and statistics, we establish a link between functional and probabilistic symmetry, and obtain functional representations of joint and conditional probability distributions that are invariant or equivariant under the action of a compact group. Those representations completely characterize the structure of neural networks that can be used to model such distributions and yield a general program for constructing invariant stochastic or deterministic neural networks. We develop the details of the general program for exchangeable sequences and arrays, recovering a number of recent examples as special cases.

Joint work with Ben Bloem-Reddy. Preprint at https://arxiv.org/abs/1901.06082
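To illustrate the kind of architecture the abstract refers to, here is a minimal sketch (not taken from the talk or the preprint) of the sum-pooling construction f(X) = rho(sum_i phi(x_i)), a permutation-invariant network of the Deep Sets form that the paper recovers as a special case of its general program. The feature maps, weights and dimensions below are arbitrary placeholders chosen only for illustration.

import numpy as np

rng = np.random.default_rng(0)

# Illustrative stand-ins for the per-element map phi and the readout rho;
# in practice these would be learnable networks.
W_phi = rng.normal(size=(3, 8))   # maps each 3-d element to an 8-d embedding
W_rho = rng.normal(size=(8, 1))   # maps the pooled embedding to a scalar output

def f(X):
    """Permutation-invariant function of a set X (n x 3 array)."""
    H = np.tanh(X @ W_phi)        # phi applied to each element independently
    pooled = H.sum(axis=0)        # sum-pooling removes any dependence on order
    return np.tanh(pooled @ W_rho)

X = rng.normal(size=(5, 3))           # a "set" of 5 elements
perm = rng.permutation(5)
print(np.allclose(f(X), f(X[perm])))  # True: the output is invariant to reordering

Because the only interaction between elements happens through the symmetric sum, the output is unchanged under any permutation of the input rows, which is the functional counterpart of the probabilistic exchangeability discussed in the talk.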


SPEAKER


Yee Whye Teh is a Professor of Statistical Machine Learning at the Department of Statistics, University of Oxford, and a Research Scientist at DeepMind. He is currently a European Research Council Consolidator Fellow. He obtained his Ph.D. at the University of Toronto (under Prof. Geoffrey E. Hinton) and did postdoctoral work at the University of California, Berkeley (under Prof. Michael I. Jordan) and at the National University of Singapore (as a Lee Kuan Yew Postdoctoral Fellow). He was a Lecturer and then a Reader at the Gatsby Computational Neuroscience Unit, UCL, from January 2007 to August 2012.

His research interests are in machine learning and computational statistics, in particular probabilistic methods, Bayesian nonparametrics and deep learning. He develops novel models as well as efficient algorithms for inference and learning.

He was programme co-chair (with Prof. Michael Titterington) of the International Conference on Artificial Intelligence and Statistics (AISTATS) 2010 and programme co-chair (with Prof. Doina Precup) of the International Conference on Machine Learning (ICML) 2017, and is or has been an associate editor for Bayesian Analysis, IEEE Transactions on Pattern Analysis and Machine Intelligence, Machine Learning Journal, Statistical Science, the Journal of the Royal Statistical Society Series B and the Journal of Machine Learning Research. He has been an area chair for NIPS, ICML and AISTATS on multiple occasions.