Learning and Learning to Learn at Scale

Abstract

Progress in deep learning is anchored on a tripod: increased computational power, more sophisticated neural network architectures, and growing datasets. Tremendous advances have been made, but we have yet to realize what is possible. Training of neural networks for vision and language model applications is inhibited by the prohibitive cost of model exploration and model training. Scientific applications of interest to the US Department of Energy and US national laboratories face additional challenges: (1) scientific data are often hard to label and require domain expertise, (2) predictive models must quantify uncertainty, and (3) models must satisfy scientific constraints. Given these requirements and others, neural network architectures must be frequently tuned for specific scientific applications, requiring extensive architecture search and hyperparameter tuning. In this talk, I will present our ongoing work in large-scale training and architecture search of deep neural networks. I will discuss different parallelization techniques in an HPC-centric deep learning toolkit, Livermore Big Artificial Neural Networks (LBANN), and its extension for fast, robust, and scalable neural architecture search. Experimental results from different application domains, including small molecule antiviral drug design for COVID-19, will be presented.

Bio
Sam Ade Jacobs is a computer scientist and a project lead in the Center for Applied Scientific Computing (CASC) at Lawrence Livermore National Laboratory (LLNL). He received his Ph.D. in Computer Science from Texas A&M University. His broad research experiences and interests include parallel computing, large-scale graph (data) analytics, scalable machine learning, and robotics. His current work focuses on developing new algorithms for large-scale deep learning with applications in drug design, high energy physics, cosmology, and global nuclear non-proliferation analysis. Jacobs' work represents significant advances to the broader field of computational sciences as well as to (big) data science by providing and maintaining unique data analytics capabilities. His approaches and tools are designed to operate on massive supercomputers at LLNL (and other DOE labs) and are tested on those machines. Department Link: https://computing.llnl.gov/casc/

Description
This talk is part of the Computing Research Leadership Council (CRLC) Virtual Series and is hosted by the SIAM Student Chapter at Arizona State University (ASU) in collaboration with the SIAM Student Chapters at the University of California, Merced; Virginia Tech; and the University of Arizona.
Register early to qualify for "free lunch" Grubhub food credit on us!
 
Speaker

Sam Ade Jacobs
Computer scientist and Project Lead
Center for Applied Scientific Computing (CASC)
Lawrence Livermore National Laboratory (LLNL)

Location
Virtual via Zoom (advance registration required)