Ph.D. Student, Computer Science
About Me
Hi! I am a second-year Ph.D. student in the Computer Science Department at Carnegie Mellon University, advised by Prof. Zico Kolter. I am broadly interested in machine learning, both theoretical and applied. I have four years of industry experience operationalizing contemporary machine learning research for real-world applications; this experience has highlighted several promising research directions and gaps in the literature, some of which I am currently exploring. My current research interests are robust deep learning, distribution shift, and adversarial learning.
Education
Ph.D. in Computer Science.
Aug 2021 - Current
Carnegie Mellon University Pittsburgh, PA
M.S. in Statistics. (GPA 3.77 / 4.00)
Mar 2015 - Jun 2017
Stanford University Stanford, CA
B.S. in Computer Science. (GPA 3.54 / 4.00)
Sep 2013 - Jun 2017
Stanford University Stanford, CA
Experience
I performed research in the AutoML / NAS and fairness in ML domains. This work led to five papers.
I designed and implemented scalable deep learning architectures, including LSTM forecasting models, AutoML / NAS regression and classification models, GAN data augmentation models, and VAE anomaly detection models, among others.
I worked on improving contemporary statistical learning and applied graph theory models for natural language applications. The algorithms I developed helped analyze and decipher global news data.
Research Intern at Andrew Ng's Artificial Intelligence Lab
Stanford University, CA
Jul 2015 - Sep 2015
I worked on the system infrastructure and CUDA code for a hybrid CNN-LSTM architecture designed to detect and semantically segment multiple objects in images and videos in real time.
Cofounder (CTO) at Ebotic, Inc.
Palo Alto, CA
Jul 2014 - Dec 2015
I worked with an international team to develop an intelligent drone platform that applied advanced flight technologies, SLAM, and deep learning for improved flight stability and awareness.
Research Intern at Sebastian Thrun's Artificial Intelligence Lab
Stanford University, CA
Jun 2014 - Aug 2014
I improved the performance of machine learning algorithms for smart home applications by adding thermal image descriptors into a robotics pipeline.
Skills
Computer Languages
Python, Julia, C / C++, CUDA, JavaScript, R, Java, MATLAB, Racket, Haskell, LaTeX, SQL, NoSQL, and HTML5 / CSS3.
Frameworks / Tools
PyTorch, TensorFlow, NumPy, Matplotlib, Jupyter, spaCy, NLTK, AllenNLP, Linux, AWS, GCP, Docker, Git, Visual Studio Code, Vim, React, Redux, Webpack, Flask, Blender, Photoshop, and Figma.
Other Interests
Analysis, algebra, topology, incentive theory, economics, cognitive science, neuroscience, videography, scuba diving, rock climbing, and fitness.
Conference Publications
2022
Deep Equilibrium Optical Flow Estimation.
S. Bai, Z. Geng, Y. Savani, J. Z. Kolter.
Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR) 2022.
2021
NAS-Bench-x11 and the Power of Learning Curves.
S. Yan, C. White, Y. Savani, F. Hutter.
Advances in Neural Information Processing Systems (NeurIPS) 2021.
Exploring the Loss Landscape in Neural Architecture Search.
C. White, S. Nolen, Y. Savani.
Conference on Uncertainty in Artificial Intelligence (UAI). PMLR, 2021.
BANANAS: Bayesian Optimization with Neural Architectures for Neural Architecture Search.
C. White, W. Neiswanger, Y. Savani.
Proceedings of the AAAI Conference on Artificial Intelligence (AAAI) 2021.
2020
Intra-Processing Methods for Debiasing Neural Networks.
Y. Savani, C. White, N. S. Govindarajulu.
Advances in Neural Information Processing Systems (NeurIPS) 2020.
A Study on Encodings for Neural Architecture Search.
C. White, W. Neiswanger, S. Nolen, Y. Savani.
Advances in Neural Information Processing Systems (NeurIPS) 2020.
Workshop Publications
2020
A Study on Encodings for Neural Architecture Search.
C. White, W. Neiswanger, S. Nolen, Y. Savani. (Poster)
ICML Workshop on AutoML, 2020.
Local Search is State of the Art for Neural Architecture Search Benchmarks.
C. White, S. Nolen, Y. Savani. (Poster)
ICML Workshop on AutoML, 2020.
2019
Neural Architecture Search via Bayesian Optimization with a Neural Network Prior.
C. White, W. Neiswanger, Y. Savani.
NeurIPS Workshop on Meta Learning, 2019.
Deep Uncertainty Estimation for Model-based Neural Architecture Search.
C. White, W. Neiswanger, Y. Savani.
NeurIPS Workshop on Bayesian Deep Learning, 2019.
Talks
2020
NeurIPS 2020 Short Presentation
Oct 2020
Intra-Processing Methods for Debiasing Neural Networks.
Abacus.AI Workshop
Aug 2020
Explainability and Bias in Neural Nets.
AICamp Workshop
May 2020
Unsupervised Learning & Deep Learning Based Forecasting.
Abacus.AI Talk
Feb 2020
Anomaly Detection.
2019
Abacus.AI Talk
Aug 2019
XLNet: The State-of-the-Art in Language Models.