I am a fifth-year PhD student in the Department of Computer Science at Stanford University. I am fortunate to be supervised by Mert Pilanci. My research interests are in optimization for machine learning.
Previously, I completed a master's in computer science at the University of British Columbia, where I was advised by Mark Schmidt.
I am supported by an NSF Graduate Research Fellowship and an NSERC Postgraduate Scholarship (Doctoral).
Sept. 25: One paper accepted at NeurIPS 2024. Many thanks to my fantastic collaborators!
Aug. 16: I visited the MPI for Intelligent Systems and the ELLIS Institute in Tübingen to give a talk on level set teleportation. The slides are available on GitHub.
Aug. 5: TMLR added me to their list of expert reviewers.
July 23: I was recognized as a top reviewer for ICML 2024.
July 1 - Dec 31: I am interning with Francis Bach at Inria in Paris.
May 28 - June 4: I visited Emtiyaz Khan at RIKEN AIP in Tokyo and gave a talk on the solution set of two-layer ReLU networks.
April 30: I will intern on the Inria Sierra Team this summer. Drop me a line if you're in Paris!
April 16: I am organizing "optimization lunches" this quarter. Join our mailing list for details.
April 4: I finally cleaned up the stochastic acceleration results from my MSc thesis. See my new arXiv paper with Mert Pilanci and Mark Schmidt for details.
March 8: Three new preprints on arXiv! We develop new theory for level set teleportation, introduce directional smoothness, a generalization of Lipschitz-continuous gradients, and study neural networks with scalar inputs.
Jan. 23 - Feb. 2: I visited the Flatiron Institute in New York City.
Jan. 25: I gave a talk at the MML Seminar on the solution sets of ReLU networks.
Nov. 27: I was recognized as a top reviewer for NeurIPS 2023.
Oct. 26: Two workshop papers accepted at NeurIPS OPT 2023!
July 4: New arXiv paper on greedy rules for block-coordinate descent with affine constraints. The analysis is based on steepest descent in the 1-norm, leading to short and elegant proofs.
June 3: I will give a talk at the SIAM OP23 minisymposium on adaptivity in stochastic optimization.
June 2: New preprint out! We derive the set of global optima for two-layer ReLU networks.
April 24: Our work on solution sets and regularization paths of ReLU networks was accepted to ICML 2023! The preprint will be out soon.
April 1: I will intern at the Flatiron Institute with Robert Gower and Alberto Bietti this summer.
Nov. 15: I had two submissions accepted to the NeurIPS OPT 2022 workshop!
Nov. 11: I was recognized as a top reviewer for NeurIPS 2022.
June 19-24: I attended the International HPC Summer School in Athens.
May 15: Our submission was accepted to ICML 2022!
April 21: I was recognized as a highlighted reviewer for ICLR 2022.
February 7: New preprint out! We develop convex optimization algorithms for training shallow ReLU networks by leveraging equivalent model classes and cone decompositions.
September 1: I finally moved to Stanford, CA. It's great to be here!
June 30: I was an expert reviewer and top reviewer for ICML 2021.
April 10: Mark Schmidt and Frederik Kunstner received the best paper award at AISTATS 2021 for their work on the convergence of expectation-maximization. Congratulations!
October 20: I was ranked in the top 10% of reviewers for NeurIPS this year.
September 15: I successfully "defended" my master's thesis on first-order optimization under interpolation conditions. The thesis is available here.
September 10: I have relocated (virtually) to Stanford to begin my PhD!
June 29 - July 10: I had a lot of fun attending the virtual MLSS this year — many thanks to the organizers for their hard work! My virtual poster is public on YouTube.
June 15: A new preprint is on arXiv! I had great fun helping out with the experiments for this work on implicit regularization and preconditioners in generalized linear models.
April 29: I will attend the (virtual) MLSS 2020 from June 28 to July 10 this summer.
September 25: I'm organizing an overview of the interplay between optimization and generalization for the UBC MLRG this semester! It covers "sharp" local minima, implicit regularization, and interpolation.
September 4: I ranked in the top 50% of reviewers for NeurIPS 2019! This was my first time reviewing; I reviewed eight papers, including two emergency reviews.
September 4: Our work on the stochastic Armijo line-search has been accepted for a poster at NeurIPS 2019!
May 1 - August 31: I'm interning with Cédric Archambeau and Matthias Seeger at Amazon Research in Berlin.
September 4: Our paper on low-rank Gaussian variational inference for Bayesian neural networks was accepted at NeurIPS 2018!
June 25 - 29: I was a teaching assistant for the approximate Bayesian inference tutorial at the 2018 Data Science Summer School at École Polytechnique. The tutorial resources are here.
January 21 - June 30: I joined Emtiyaz Khan as an intern at the RIKEN Center for Advanced Intelligence Project (AIP).
You can reach me by email; my GitHub is aaronpmishkin and my CV is here.
The darkness through which we are groping is too thick for us to make any pronouncements about it; we cannot even say that it is doomed to last. — C. Lévi-Strauss