Roy Frostig

I'm a research scientist at Google Brain, focusing on the mathematical foundations, software systems, and experimentation at the core of statistical machine learning.

Percy Liang was my PhD advisor at Stanford, where I was part of the statistical machine learning group.


>research

Publications and preprints, also listed on Google Scholar:

Measuring the effects of data parallelism on neural network training
Chris Shallue, Jaehoon Lee, Joseph Antognini, Jascha Sohl-Dickstein, Roy Frostig, George E. Dahl
arXiv preprint arXiv:1811.03600, 2018

Compiling machine learning programs via high-level tracing
Roy Frostig, Matthew Johnson, Chris Leary
SysML, 2018
Reports on a nascent version of JAX.

Random features for compositional kernels
Amit Daniely, Roy Frostig, Vineet Gupta, Yoram Singer
arXiv preprint arXiv:1703.07872, 2017

Estimation from indirect supervision with linear moments
Aditi Raghunathan, Roy Frostig, John Duchi, Percy Liang
International Conference on Machine Learning (ICML), 2016

Principal component projection without principal component analysis
Roy Frostig, Cameron Musco, Christopher Musco, Aaron Sidford
International Conference on Machine Learning (ICML), 2016

Toward deeper understanding of neural networks: the power of initialization and a dual view on expressivity
Amit Daniely, Roy Frostig, Yoram Singer
Neural Information Processing Systems (NeurIPS), 2016

Un-regularizing: approximate proximal point and faster stochastic algorithms for empirical risk minimization
Roy Frostig, Rong Ge, Sham M. Kakade, Aaron Sidford
International Conference on Machine Learning (ICML), 2015

Competing with the empirical risk minimizer in a single pass
Roy Frostig, Rong Ge, Sham M. Kakade, Aaron Sidford
Conference on Learning Theory (COLT), 2015

Simple MAP inference via low-rank relaxations
Roy Frostig, Sida Wang, Percy Liang, Chris Manning
Neural Information Processing Systems (NeurIPS), 2014

Semantic parsing on Freebase from question-answer pairs
Jonathan Berant, Andrew Chou, Roy Frostig, Percy Liang
Empirical Methods in Natural Language Processing (EMNLP), 2013
Corresponds to the initial version of SEMPRE.

>software projects

JAX – composable transformations of numerical Python programs: differentiate, vectorize, JIT-compile to accelerators, etc.

SEMPRE – a toolkit for training semantic parsers.

Rust – a language for reliable, concurrent systems programming, to which I contributed in the run-up to version 0.1.
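
The composable transformations that JAX provides can be sketched in a few lines. This is an illustrative toy, not drawn from this page: the function `loss` and its shapes are made up for the example, while `jax.grad`, `jax.vmap`, and `jax.jit` are the library's actual transformation API.

```python
# A minimal sketch of JAX's composable transformations:
# differentiate (grad), vectorize (vmap), and JIT-compile (jit).
import jax
import jax.numpy as jnp

def loss(w, x):
    # A toy scalar function of parameters w and one input x.
    return jnp.sum((w * x) ** 2)

# Differentiate with respect to the first argument (w).
grad_loss = jax.grad(loss)

# Vectorize the gradient over a leading batch axis of x,
# holding w fixed across the batch.
batched_grad = jax.vmap(grad_loss, in_axes=(None, 0))

# JIT-compile the composed function via tracing, producing a
# program that can run on accelerators.
fast_batched_grad = jax.jit(batched_grad)

w = jnp.ones(3)
xs = jnp.arange(6.0).reshape(2, 3)
print(fast_batched_grad(w, xs).shape)  # (2, 3)
```

The point is that each transformation takes a function and returns a function, so they nest freely in any order.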