Large Scale Stochastic Training of Neural Networks

Lecture

Speaker(s)

Amir Gholami

Postdoctoral Scholar, Electrical Engineering and Computer Sciences; Postdoctoral Research Fellow, Berkeley Artificial Intelligence Research (BAIR) Lab; Postdoctoral Fellow, Foundations of Data Analysis (FODA) Institute; BIDS Data Science Fellow

Amir Gholami is a postdoctoral research fellow in the Berkeley AI Research Lab, working under the supervision of Prof. Kurt Keutzer. He received his PhD in Computational Science and Engineering Mathematics from UT Austin, where he worked with Prof. George Biros on biophysics-based image analysis, research that received UT Austin's best doctoral dissertation award in 2018. Amir has extensive experience in second-order optimization methods, image registration, inverse problems, and large-scale parallel computing, developing codes that have been scaled up to 200K cores. He is a Melosh Medal finalist, a recipient of the best student paper award at SC'17 and a Gold Medal in the ACM Student Research Competition, and a best student paper finalist at SC'14. His current research includes large-scale training of neural networks, stochastic second-order optimization methods, and robust optimization.