BIDS Machine Learning and Science Forum
Date: Monday, February 22, 2021
Time: 11:00 AM - 12:00 PM Pacific Time
Location: Participate remotely using this Zoom link
Noisy Recurrent Neural Networks
Speaker: Soon Hoe Lim, KTH
Recurrent neural networks (RNNs) constitute a class of powerful brain-inspired models widely used in machine learning for analyzing sequential data. Recent efforts have shown how injecting noise into the networks can improve stability during training and, consequently, improve robustness with respect to data perturbations. In this talk, we discuss a general framework for studying RNNs trained by injecting noise into the hidden states. Specifically, we consider RNNs that can be viewed as discretizations of stochastic differential equations driven by input data. This framework allows us to study the implicit regularization effect of general noise injection schemes. Our theory is supported by empirical results demonstrating improved robustness of noise-injected RNN classifiers with respect to various input perturbations, while maintaining state-of-the-art performance.
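To make the "RNN as a discretized SDE" viewpoint concrete, the sketch below shows one possible hidden-state update: an Euler-Maruyama step of dh = f(h, x) dt + σ dW with a tanh drift. All dimensions, weights, and the drift form are illustrative assumptions, not the speaker's actual model.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions and randomly initialized weights (for illustration only)
d_h, d_x = 8, 4                               # hidden and input dimensions
W = rng.normal(scale=0.3, size=(d_h, d_h))    # recurrent weights
U = rng.normal(scale=0.3, size=(d_h, d_x))    # input weights
b = np.zeros(d_h)                             # bias

def noisy_rnn_step(h, x, dt=0.1, sigma=0.05):
    """One Euler-Maruyama step of dh = f(h, x) dt + sigma dW,
    with an assumed drift f(h, x) = tanh(W h + U x + b) - h."""
    drift = np.tanh(W @ h + U @ x + b) - h
    noise = sigma * np.sqrt(dt) * rng.normal(size=h.shape)  # injected Gaussian noise
    return h + dt * drift + noise

# Roll the noisy recurrence over a short random input sequence
h = np.zeros(d_h)
for _ in range(20):
    x_t = rng.normal(size=d_x)
    h = noisy_rnn_step(h, x_t)
```

Setting `sigma=0` recovers an ordinary (deterministic) discretized RNN, which is what makes the noise injection read as a regularizer on top of the base dynamics.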
The BIDS Machine Learning and Science Forum meets biweekly to discuss current applications across a wide variety of research domains in the physical sciences and beyond. These active sessions bring together domain scientists, statisticians, and computer scientists who are either developing state-of-the-art methods or are interested in applying these methods in their research. This Forum is organized by BIDS Faculty Affiliate Uroš Seljak (professor of Physics at UC Berkeley), BIDS Research Affiliate Ben Nachman (physicist at Lawrence Berkeley National Laboratory), Vanessa Böhm, and Ben Erichson. All interested members of the UC Berkeley and Berkeley Lab communities are welcome and encouraged to attend. To receive email notifications about upcoming meetings, or to request more information, please contact berkeleymlforum@gmail.com.