Roy Frostig

PhD student
Stanford Computer Science

rf (at) cs


My research focuses on computational tools at the core of statistical machine learning.
A recent theme is matching the performance of common, though often heavyweight, numerical methods while using limited computational resources.

Percy Liang is my PhD advisor.

Research papers
Principal component projection without principal component analysis.
Roy Frostig, Cameron Musco, Christopher Musco, Aaron Sidford.
arXiv preprint arXiv:1602.06872, 2016.
bib - paper
Toward deeper understanding of neural networks: the power of initialization and a dual view on expressivity.
Amit Daniely, Roy Frostig, Yoram Singer.
arXiv preprint arXiv:1602.05897, 2016.
bib - paper
Un-regularizing: approximate proximal point and faster stochastic algorithms for empirical risk minimization.
Roy Frostig, Rong Ge, Sham M. Kakade, Aaron Sidford.
International Conference on Machine Learning (ICML), 2015.
bib - paper
Competing with the empirical risk minimizer in a single pass.
Roy Frostig, Rong Ge, Sham M. Kakade, Aaron Sidford.
Conference on Learning Theory (COLT), 2015.
bib - paper
Simple MAP inference via low-rank relaxations.
Roy Frostig, Sida Wang, Percy Liang, Chris Manning.
Advances in Neural Information Processing Systems (NIPS), 2014.
bib - paper - experiments
Semantic parsing on Freebase from question-answer pairs.
Jonathan Berant, Andrew Chou, Roy Frostig, Percy Liang.
Empirical Methods in Natural Language Processing (EMNLP), 2013.
bib - paper - supplemental material - slides - project
Research notes
A sub-constant improvement in approximating the positive semidefinite Grothendieck problem.
Roy Frostig, Sida Wang.
Relaxations for inference in restricted Boltzmann machines.
Sida Wang, Roy Frostig, Percy Liang, Chris Manning.
International Conference on Learning Representations Workshop (ICLR), 2014.
Open-source involvement
SEMPRE is our semantic parsing toolkit.
Rust is a safe, concurrent systems programming language.