Stefano Ermon

Assistant Professor of Computer Science
Fellow of the Woods Institute for the Environment
Stanford University

Office: Gates Building #228

Phone: 650 498 9942

Email: ermon AT cs.stanford.edu

About Me

I am an Assistant Professor in the Department of Computer Science at Stanford University, where I am affiliated with the Artificial Intelligence Laboratory and a fellow of the Woods Institute for the Environment. My research is centered on techniques for scalable and accurate inference in graphical models, statistical modeling of data, large-scale combinatorial optimization, and robust decision making under uncertainty, and is motivated by a range of applications, in particular ones in the emerging field of computational sustainability.

Research Interests

Teaching

Publications

Conference Papers (Refereed and Archived)

  1. Neal Jean, Marshall Burke, Michael Xie, Matthew Davis, David Lobell, Stefano Ermon
    Combining Satellite Imagery and Machine Learning to Predict Poverty [PDF] [Project Website] [Commentary] [Nature Research Highlights] [Code]
    Science, 353(6301), 790-794, 2016.
  2. Jonathan Ho, Stefano Ermon
    Generative Adversarial Imitation Learning
    NIPS-16. To appear
  3. Aditya Grover, Stefano Ermon
    Variational Bayes on Monte Carlo Steroids
    NIPS-16. To appear
  4. Shengjia Zhao, Enze Zhou, Ashish Sabharwal, Stefano Ermon
    Adaptive Concentration Inequalities for Sequential Decision Problems
    NIPS-16. To appear
  5. Yexiang Xue, Zhiyuan Li, Stefano Ermon, Carla Gomes, Bart Selman
    Solving Marginal MAP Problems with NP Oracles and Parity Constraints
    NIPS-16. To appear
  6. Mitchell McIntire, Daniel Ratner, Stefano Ermon
    Sparse Gaussian Processes for Bayesian Optimization [PDF]
    UAI-16.
  7. Jonathan Ho, Jayesh Gupta, Stefano Ermon
    Model-Free Imitation Learning with Policy Optimization [PDF]
    ICML-16. In Proc. 33rd International Conference on Machine Learning, 2016.
  8. Yexiang Xue, Stefano Ermon, Ronan Le Bras, Carla Gomes, Bart Selman
    Variable Elimination in the Fourier Domain [PDF]
    ICML-16. In Proc. 33rd International Conference on Machine Learning, 2016.
  9. Steve Mussmann, Stefano Ermon
    Learning and Inference via Maximum Inner Product Search [PDF]
    ICML-16. In Proc. 33rd International Conference on Machine Learning, 2016.
  10. Tudor Achim, Ashish Sabharwal, Stefano Ermon
    Beyond Parity Constraints: Fourier Analysis of Hash Functions for Inference [PDF]
    ICML-16. In Proc. 33rd International Conference on Machine Learning, 2016.
  11. Lun-Kai Hsu, Tudor Achim, Stefano Ermon
    Tight Variational Bounds via Random Projections and I-Projections [PDF]
    AISTATS-16.
  12. Michael Xie, Neal Jean, Marshall Burke, David Lobell, Stefano Ermon
    Transfer Learning from Deep Features for Remote Sensing and Poverty Mapping [PDF] [Stanford Report] [NYTimes]
    AAAI-16. In Proc. 30th AAAI Conference on Artificial Intelligence, 2016.
  13. Shengjia Zhao, Sorathan Chaturapruek, Ashish Sabharwal, Stefano Ermon
    Closing the Gap Between Short and Long XORs for Model Counting [PDF] [Code]
    AAAI-16. In Proc. 30th AAAI Conference on Artificial Intelligence, 2016.
  14. Carolyn Kim, Ashish Sabharwal, Stefano Ermon
    Exact Sampling with Integer Linear Programs and Random Perturbations [PDF] [Code]
    AAAI-16. In Proc. 30th AAAI Conference on Artificial Intelligence, 2016.
  15. Stefan Hadjis, Stefano Ermon
    Importance sampling over sets: a new probabilistic inference scheme. [PDF] [Code]
    UAI-15. In Proc. Conference on Uncertainty in Artificial Intelligence, 2015.
  16. Michael Zhu, Stefano Ermon
    A Hybrid Approach for Probabilistic Inference using Random Projections. [PDF]
    ICML-15. In Proc. 32nd International Conference on Machine Learning, 2015.
  17. Yexiang Xue, Stefano Ermon, Carla Gomes, Bart Selman
    Uncovering Hidden Structure through Parallel Problem Decomposition for the Set Basis Problem with Application to Materials Discovery. [PDF]
    IJCAI-15. In Proc. International Joint Conference on Artificial Intelligence, 2015.
  18. Stefano Ermon, Yexiang Xue, Russell Toth, Bistra Dilkina, Richard Bernstein, Theodoros Damoulas, Patrick Clark, Steve DeGloria, Andrew Mude, Christopher Barrett, and Carla Gomes
    Learning Large Scale Dynamic Discrete Choice Models of Spatio-Temporal Preferences with Application to Migratory Pastoralism in East Africa. [PDF]
    AAAI-15. In Proc. 29th AAAI Conference on Artificial Intelligence, 2015.
  19. Stefano Ermon, Ronan Le Bras, Santosh Suram, John M. Gregoire, Carla Gomes, Bart Selman, and Robert B. van Dover
    Pattern Decomposition with Complex Combinatorial Constraints: Application to Materials Discovery. [PDF]
    AAAI-15. In Proc. 29th AAAI Conference on Artificial Intelligence, 2015.
  20. Stefano Ermon, Carla Gomes, Ashish Sabharwal, and Bart Selman.
    Designing Fast Absorbing Markov Chains. [PDF]
    AAAI-14. In Proc. 28th AAAI Conference on Artificial Intelligence, July 2014.
  21. Stefano Ermon, Carla Gomes, Ashish Sabharwal, and Bart Selman.
    Low-density Parity Constraints for Hashing-Based Discrete Integration. [PDF] [Code]
    ICML-14. In Proc. 31st International Conference on Machine Learning, June 2014.
  22. Stefano Ermon, Carla Gomes, Ashish Sabharwal, and Bart Selman.
    Embed and Project: Discrete Sampling with Universal Hashing. [PDF] [Code]
    NIPS-13. In Proc. 27th Annual Conference on Neural Information Processing Systems, December 2013.
  23. Stefano Ermon, Carla Gomes, Ashish Sabharwal, and Bart Selman.
    Optimization With Parity Constraints: From Binary Codes to Discrete Integration. [PDF] [Slides] [Poster] [Code]
    UAI-13. In Proc. 29th Conference on Uncertainty in Artificial Intelligence, July 2013.
    [Oral presentation, acceptance rate: 26/233 (11%)]
    Best student paper award.
    Best paper award runner-up.
  24. Stefano Ermon, Carla Gomes, Ashish Sabharwal, and Bart Selman.
    Taming the Curse of Dimensionality: Discrete Integration by Hashing and Optimization. [PDF] [Slides] [Code]
    ICML-13. In Proc. 30th International Conference on Machine Learning, June 2013.
    [Full oral presentation, acceptance rate 143/1204 (12%)]
  25. Stefano Ermon, Carla Gomes, Ashish Sabharwal, and Bart Selman.
    Density Propagation and Improved Bounds on the Partition Function. [PDF] [Poster]
    NIPS-12. In Proc. 26th Annual Conference on Neural Information Processing Systems, December 2012.
    [Full paper, acceptance rate: 370/1467 (25%)]
  26. Stefano Ermon, Carla Gomes, and Bart Selman.
    Uniform Solution Sampling Using a Constraint Solver As an Oracle. [PDF] [Slides] [Code]
    UAI-12. In Proc. 28th Conference on Uncertainty in Artificial Intelligence, August 2012.
    [Oral presentation, acceptance rate: 24/304 (8%)]
  27. Liaoruo Wang, Stefano Ermon, and John Hopcroft.
    Feature-Enhanced Probabilistic Models for Diffusion Network Inference. [PDF] [Slides] [Code]
    ECML-PKDD-12. In Proc. of European Conference on Machine Learning and Principles and Practice of Knowledge Discovery in Databases, September 2012.
    [Full paper, acceptance rate: 105/443 (24%)]
  28. Stefano Ermon, Yexiang Xue, Carla Gomes, and Bart Selman.
    Learning Policies For Battery Usage Optimization in Electric Vehicles. [PDF] [Slides]
    ECML-PKDD-12. In Proc. of European Conference on Machine Learning and Principles and Practice of Knowledge Discovery in Databases, September 2012.
    [Full paper, acceptance rate: 105/443 (24%)]
  29. Stefano Ermon, Ronan Le Bras, Carla Gomes, Bart Selman, and Bruce van Dover.
    SMT-Aided Combinatorial Materials Discovery. [PDF]
    SAT-12. In Proc. 15th International Conference on Theory and Applications of Satisfiability Testing, June 2012.
    [Full paper, acceptance rate: 29/88 (33%)]
  30. Stefano Ermon, Carla Gomes, Bart Selman, and Alexander Vladimirsky.
    Probabilistic Planning With Non-linear Utility Functions and Worst Case Guarantees. [PDF]
    AAMAS-12. In Proc. 11th International Conference on Autonomous Agents and Multiagent Systems, June 2012.
    [Full paper, acceptance rate: 137/671 (20.5%)]
  31. Stefano Ermon, Carla Gomes, Ashish Sabharwal, and Bart Selman.
    Accelerated Adaptive Markov Chain for Partition Function Computation. [PDF]
    NIPS-11. In Proc. 25th Annual Conference on Neural Information Processing Systems, December 2011.
    [Spotlight presentation, acceptance rate: 66/1400 (4.7%)]
  32. Stefano Ermon, Carla Gomes, and Bart Selman.
    A Flat Histogram Method for Computing the Density of States of Combinatorial Problems. [PDF]
    IJCAI-11. In Proc. 22nd International Joint Conference on Artificial Intelligence, July 2011. Invited paper.
  33. Stefano Ermon, Jon Conrad, Carla Gomes, and Bart Selman.
    Risk-Sensitive Policies for Sustainable Renewable Resource Allocation. [PDF]
    IJCAI-11. In Proc. 22nd International Joint Conference on Artificial Intelligence, July 2011.
    [Oral presentation, acceptance rate: 227/1325 (17%)]
  34. Stefano Ermon, Carla Gomes, and Bart Selman.
    A Message Passing Approach to Multiagent Gaussian Inference for Dynamic Processes (Short Paper). [PDF]
    AAMAS-11. In Proc. 10th International Conference on Autonomous Agents and Multiagent Systems, May 2011.
  35. Stefano Ermon, Carla Gomes, and Bart Selman.
    Computing the Density of States of Boolean Formulas. [PDF] [Slides]
    CP-10. In Proc. 16th International Conference on Principles and Practice of Constraint Programming, September 2010.
    Best student paper award.
  36. Stefano Ermon, Jon Conrad, Carla Gomes, and Bart Selman.
    Playing Games against Nature: Optimal Policies for Renewable Resource Allocation. [PDF]
    UAI-10. In Proc. 26th Conference on Uncertainty in Artificial Intelligence, July 2010.
    [Plenary oral presentation, acceptance rate: 30/260 (11.5%)]
    (also presented at the 2nd International Conference on Computational Sustainability, June 2010; the 1st DIMACS Workshop on Conservation Biology, July 2010; and the 3rd International Workshop on Constraint Reasoning and Optimization for Computational Sustainability, September 2010)
  37. Stefano Ermon, Carla Gomes, and Bart Selman.
    Collaborative Multiagent Gaussian Inference in a Dynamic Environment Using Belief Propagation (Short Paper). [PDF]
    AAMAS-10. In Proc. 9th International Conference on Autonomous Agents and Multiagent Systems, May 2010.
  38. Stefano Ermon, Luca Schenato, and Sandro Zampieri.
    Trust Estimation in Autonomic Networks: a Statistical Mechanics Approach. [PDF]
    CDC-09. In Proc. 48th IEEE Conference on Decision and Control (CDC), December 2009.

Journal Articles

  1. Stefano Ermon, Yexiang Xue, Carla Gomes, and Bart Selman.
    Learning Policies For Battery Usage Optimization in Electric Vehicles.
    Machine Learning, 92(1):177-194, 2013.

Honors and Awards

Professional Service

Recent Invited Talks

Software and Benchmarks

WISH
WISH is a general algorithm to approximate (with high probability and within any desired degree of accuracy) discrete weighted sums defined over exponentially large sets of items. This implementation is specifically designed to approximate the partition function (normalization constant) of discrete probabilistic graphical models, by (approximately) solving a small number of optimization instances (maximum likelihood queries) using a combinatorial optimization package.
  • Source [UAI 2015]
  • Binaries (Linux, 64 bit) of WISH version 1.0 [ICML 2013]
  • Source [UAI 2013]
  • Instances used to evaluate the WISH algorithm [ICML 2013]
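As a rough illustration of the idea behind WISH (a toy sketch only, not the released implementation: the brute-force loop below stands in for the combinatorial optimization oracle, and the name `wish_estimate` is invented here), one can estimate a weighted sum over {0,1}^n by repeatedly adding random parity constraints and recording the resulting optimal values:

```python
import random
from statistics import median

def wish_estimate(weight, n, T=7, seed=0):
    """Toy WISH-style estimator for sum_x weight(x) over {0,1}^n.

    M[i] is the median (over T trials) of the maximum weight subject to
    i random parity (XOR) constraints; the final estimate combines the
    M[i] with geometric coefficients. Brute force replaces the MAP oracle.
    """
    rng = random.Random(seed)
    points = [tuple((x >> j) & 1 for j in range(n)) for x in range(2 ** n)]

    def max_with_parities(cons):
        best = 0.0
        for p in points:  # exhaustive search stands in for an optimizer
            if all(sum(p[j] for j in A) % 2 == b for A, b in cons):
                best = max(best, weight(p))
        return best

    M = []
    for i in range(n + 1):
        trials = []
        for _ in range(T if i > 0 else 1):
            # each constraint: a random subset A of variables, a random bit b
            cons = [([j for j in range(n) if rng.random() < 0.5],
                     rng.randrange(2)) for _ in range(i)]
            trials.append(max_with_parities(cons))
        M.append(median(trials))
    return M[0] + sum(2 ** (i - 1) * M[i] for i in range(1, n + 1))
```

With the uniform weight w(x) = 1 this estimates the number of points (here 2^n), up to the constant-factor approximation guarantee of the actual algorithm.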
       
MoNet
MoNet is an algorithm that infers latent network structure from observations of textual "cascades" spreading over the network. For example, MoNet can infer a following relationship (who is following whom) on Twitter by observing sequences of tweets. It first defines a probabilistic model that takes into account both the timing and the textual content of the messages, and then infers the most likely underlying network structure by solving a sequence of convex optimization problems.
  • MoNet source code for inferring latent network structure based on textual cascades [ECML-PKDD 2012]
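For intuition about the input and output of this kind of cascade-based inference (and only that: MoNet itself fits a probabilistic model of times and text via convex optimization, whereas the sketch below, with invented names, is just a co-occurrence heuristic):

```python
from collections import defaultdict

def infer_edges(cascades, window=1.0, min_support=2):
    """Toy cascade-based network inference: propose an edge u -> v whenever
    u is infected shortly before v in enough cascades.

    Each cascade is a list of (node, time) infection events, in any order.
    """
    support = defaultdict(int)
    for cascade in cascades:
        events = sorted(cascade, key=lambda e: e[1])  # order by time
        for i, (u, tu) in enumerate(events):
            for v, tv in events[i + 1:]:
                if 0 < tv - tu <= window:   # v infected soon after u
                    support[(u, v)] += 1
    return {e for e, c in support.items() if c >= min_support}
```

The real method replaces counting with maximum-likelihood estimation under a transmission model, which turns edge recovery into convex subproblems.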

Combinatorial Materials Discovery Benchmark Instances
In combinatorial materials discovery, materials scientists search for new materials with desirable physical properties by obtaining x-ray diffraction measurements on hundreds of samples from a composition spread. We integrated domain-specific scientific background knowledge about the physical and chemical properties of the materials into a Satisfiability Modulo Theories (SMT) reasoning framework based on linear arithmetic. Using a novel encoding, state-of-the-art SMT solvers can automatically analyze large synthetic datasets, and generate interpretations that are physically meaningful and very accurate, even in the presence of noise.
  • SMT instances (SMTLib Version 2.0) for the phase map identification of the Al-Li-Fe synthetic system [SAT 2012]

Search Tree Sampler
SearchTreeSampler is a sampling technique that, while enforcing an approximately uniform exploration of the search space, leverages the reasoning power of a systematic constraint solver in a black-box scheme. The key idea is to explore the search tree uniformly in a breadth-first way, subsampling a set of representative nodes at each level. The number of nodes kept at each level is a parameter that trades off uniformity against computational cost. The samples provided by SearchTreeSampler can then be used to estimate the number of solutions of the problem (its partition function).
  • Source code of Search Tree Sampler (with MiniSAT as an NP-oracle). [UAI 2012]
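The level-by-level scheme can be sketched as follows (a toy version with invented names: partial-assignment filtering replaces the constraint-solver oracle, and the solution-count estimate is the product of the per-level expansion ratios):

```python
import random

def sts_count(n, consistent, k=16, seed=0):
    """Toy Search Tree Sampler: breadth-first over partial assignments of
    n binary variables, keeping at most k representative nodes per level,
    and estimating the number of solutions from per-level growth ratios."""
    rng = random.Random(seed)
    level = [()]          # level 0: the empty partial assignment
    estimate = 1.0
    for _ in range(n):
        # expand every kept node by 0/1 and prune inconsistent children
        children = [p + (b,) for p in level for b in (0, 1)
                    if consistent(p + (b,))]
        if not children:
            return 0.0
        estimate *= len(children) / len(level)
        # subsample at most k children to carry to the next level
        level = children if len(children) <= k else rng.sample(children, k)
    return estimate

# example constraint: no two adjacent variables both set to 1
no_adj = lambda p: all(not (a and b) for a, b in zip(p, p[1:]))
```

With k large enough to keep every node the ratios telescope and the count is exact; smaller k trades accuracy for cost, as in the description above.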

Energy Demand in Commuter Trips Dataset
We produced a dataset of energy demand profiles for commuter trips across the US by processing the raw data originally collected by the ChargeCar project at CMU. We used this dataset to train an intelligent energy management system for electric vehicles, based on a combination of optimization, MDPs, and supervised learning techniques. The new approach significantly outperforms the leading algorithms previously proposed as part of an open algorithmic challenge.
  • Dataset of crowdsourced commuter trips across the US (generated by processing the raw data originally collected by ChargeCar) [ECML 2012]

Flat Histogram Sampling Code
We investigated the use of advanced flat histogram sampling techniques from statistical physics to explore large combinatorial spaces. This is a class of adaptive MCMC (Markov Chain Monte Carlo) methods that can adapt transition probabilities based on the chain history. Intuitively, this allows the chain to escape from local minima, which can be very helpful for difficult energy landscapes. Further, we introduced a focused component (inspired by local search combinatorial optimization) that can leverage problem structure and speed up convergence.
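A minimal flat-histogram (Wang-Landau-style) sketch of the adaptive idea, with invented names and a fixed refinement schedule in place of the usual histogram-flatness check:

```python
import math
import random

def wang_landau(n, energy, n_levels, sweeps=20000, seed=0):
    """Toy Wang-Landau sampler: estimates the log density of states
    log g(E) of an n-bit energy landscape by penalizing, on the fly,
    energy levels the chain has already visited often."""
    rng = random.Random(seed)
    state = [rng.randrange(2) for _ in range(n)]
    log_g = [0.0] * n_levels   # running estimate of log g(E)
    log_f = 1.0                # modification factor, reduced over stages
    e = energy(state)
    for _ in range(8):         # fixed schedule instead of flatness checks
        for _ in range(sweeps):
            i = rng.randrange(n)
            state[i] ^= 1      # propose a single bit flip
            e_new = energy(state)
            # accept with probability min(1, g(E) / g(E'))
            if math.log(rng.random() + 1e-300) < log_g[e] - log_g[e_new]:
                e = e_new
            else:
                state[i] ^= 1  # reject: undo the flip
            log_g[e] += log_f  # penalize the level just visited
        log_f /= 2.0
    base = log_g[0]
    return [lg - base for lg in log_g]  # normalize so log g(0) = 0
```

On the trivial landscape energy(x) = number of ones, the recovered log g(E) should approach log C(n, E), which makes the sketch easy to sanity-check.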
       
Educational Software
A free program I developed with the help of M.D. friends to assist in preparing for the Italian National Medical Licensing Exam. Users can practice with a user-friendly GUI and over 6,000 multiple-choice questions, and focus on the most challenging subjects based on statistics collected by the program.

Personal

You can find out more about me and see some pictures here.


©2014 Stefano Ermon