I am a fourth-year PhD student in the Department of Computer Science at Stanford University. I am fortunate to be supervised by Mert Pilanci. My research interests are in optimization for machine learning.
Previously, I completed a master's in computer science at the University of British Columbia, where I was advised by Mark Schmidt.
I am supported by an NSF Graduate Research Fellowship and an NSERC Postgraduate Scholarship (Doctoral).
April 4: I finally cleaned up the stochastic acceleration results from my MSc thesis. See my new arXiv paper with Mert Pilanci and Mark Schmidt for details.
March 8: Three new preprints on arXiv! We develop new theory for level-set teleportation, build up a new generalization of Lipschitz gradients called directional smoothness, and study neural networks with scalar inputs.
Jan. 23 - Feb. 2: I visited the Flatiron Institute in New York City.
Jan. 25: I gave a talk at the MML Seminar on the solution sets of ReLU networks.
Nov. 27: I was recognized as a top reviewer for NeurIPS 2023.
Oct. 26: Two workshop papers accepted at NeurIPS OPT 2023!
July 4: New arXiv paper on greedy rules for block-coordinate descent with affine constraints. The analysis is based on steepest descent in the 1-norm, leading to short and elegant proofs.
June 3: I will give a talk at the SIAM OP23 minisymposium on adaptivity in stochastic optimization.
June 2: New preprint out! We derive the set of global optima for two-layer ReLU networks.
April 24: Our work on solution sets and regularization paths of ReLU networks was accepted to ICML 2023! The preprint will be out soon.
April 1: I will intern at the Flatiron Institute with Robert Gower and Alberto Bietti this summer.
Nov. 15: I had two submissions accepted to the NeurIPS OPT 2022 workshop!
Nov. 11: I was recognized as a top reviewer for NeurIPS 2022.
June 19-24: I attended the International HPC Summer School in Athens.
May 15: Our submission was accepted to ICML 2022!
April 21: I was recognized as a highlighted reviewer for ICLR 2022.
February 7: New preprint out! We develop convex optimization algorithms for training shallow ReLU networks by leveraging equivalent model classes and cone decompositions.
September 1: I finally moved to Stanford, CA. It's great to be here!
June 30: I was an expert reviewer and top reviewer for ICML 2021.
April 10: Mark Schmidt and Frederik Kunstner received best paper at AISTATS 2021 for their work on the convergence of expectation-maximization. Congratulations!
October 20: I was ranked in the top 10% of reviewers for NeurIPS this year.
September 15: I successfully "defended" my master's thesis on first-order optimization under interpolation conditions. The thesis is available here.
September 10: I have relocated (virtually) to Stanford to begin my PhD!
June 29 - July 10: I had a lot of fun attending the virtual MLSS this year — many thanks to the organizers for their hard work! My virtual poster is public on YouTube.
June 15: New preprint is on arXiv! I had great fun helping out with the experiments for this work on implicit regularization and preconditioners in generalized linear models.
April 29: I will attend the (virtual) MLSS 2020 from 28 June to 10 July this summer.
September 25: I'm organizing an overview of the interplay between optimization and generalization for UBC MLRG this semester! It includes "sharp" local minima, implicit regularization, and interpolation.
September 4: I ranked in the top 50% of reviewers for NeurIPS 2019! This was my first time reviewing; I reviewed eight papers, including two emergency reviews.
September 4: Our work on the stochastic Armijo line-search has been accepted for a poster at NeurIPS 2019!
May 1 - August 31: I'm interning with Cédric Archambeau and Matthias Seeger at Amazon Research in Berlin.
September 4: Our paper on low-rank Gaussian variational inference for Bayesian neural networks was accepted at NeurIPS 2018!
June 25 - 29: I was a teaching assistant for the approximate Bayesian inference tutorial at the 2018 Data Science Summer School at École Polytechnique. The tutorial resources are here.
January 21 - June 30: I joined Emtiyaz Khan as an intern at the RIKEN Center for Advanced Intelligence Project (AIP).
My email is and my GitHub is aaronpmishkin. My CV is here.