Ben Poole

I'm a PhD student working with Surya Ganguli at Stanford University in the Neural Dynamics and Computation lab. My research aims to develop a better understanding of both biological and artificial neural networks.

I did my undergrad at Carnegie Mellon University, where I was advised by Tai Sing Lee. I've worked at Google Brain, Google Research, Intel Research Pittsburgh, NYU Center for Neural Science, and the US Army Research Lab.

Email  /  Twitter  /  CV  /  Scholar  /  GitHub  /  LinkedIn

Research

I'm interested in computational neuroscience, machine learning, computer vision, optimization, cycling, and deep learning. My current focus is on improving algorithms for unsupervised and semi-supervised learning using generative models.

Time-warped PCA: simultaneous alignment and dimensionality reduction of neural data
Ben Poole, Alex Williams, Niru Maheswaranathan, Byron Yu, Gopal Santhanam, Stephen Ryu, Stephen Baccus, Krishna Shenoy, Surya Ganguli
To appear at COSYNE, 2017  
abstract

Extends dimensionality reduction techniques to account for trial-to-trial variability in timing.

Improved generator objectives for GANs
Ben Poole, Alex Alemi, Jascha Sohl-Dickstein, Anelia Angelova
NIPS Workshop on Adversarial Training, 2016  
arXiv / poster

A new take on the Generative Adversarial Network training procedure.

Categorical Reparameterization with Gumbel-Softmax
Eric Jang, Shane Gu, Ben Poole
NIPS Bayesian Deep Learning Workshop, 2016  
arXiv / blog post

Efficient gradient estimator for categorical variables.
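The core trick is to replace a hard categorical sample with a tempered softmax of Gumbel-perturbed logits, which is differentiable with respect to the logits. A minimal NumPy sketch of the sampling step (function and variable names are mine, not from the paper's code):

```python
import numpy as np

def gumbel_softmax_sample(logits, temperature=0.5, rng=None):
    """Draw one relaxed categorical sample (hypothetical helper)."""
    rng = rng or np.random.default_rng()
    # Gumbel(0, 1) noise via inverse CDF: -log(-log(U)), U ~ Uniform(0, 1).
    u = rng.uniform(low=1e-12, high=1.0, size=np.shape(logits))
    g = -np.log(-np.log(u))
    # Tempered softmax of the perturbed logits: smooth in the logits,
    # and approaches a one-hot sample as temperature -> 0.
    y = (np.asarray(logits) + g) / temperature
    y = np.exp(y - np.max(y))
    return y / np.sum(y)

probs = gumbel_softmax_sample(np.log([0.1, 0.3, 0.6]), temperature=0.1)
```

At low temperature the output concentrates near a vertex of the simplex, so it behaves like a sample while still passing gradients to the logits.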

Unrolled Generative Adversarial Networks
Luke Metz, Ben Poole, David Pfau, Jascha Sohl-Dickstein
NIPS Workshop on Adversarial Training, 2016  
arXiv / code

Stabilize GANs by defining the generator objective with respect to an unrolled optimization of the discriminator.

Exponential expressivity in deep neural networks through transient chaos
Ben Poole, Subhaneil Lahiri, Maithra Raghu, Jascha Sohl-Dickstein, Surya Ganguli
Neural Information Processing Systems (NIPS), 2016  
arXiv / code / poster

Random neural networks have curvature that grows exponentially with depth.

On the expressive power of deep neural networks
Maithra Raghu, Ben Poole, Jon Kleinberg, Surya Ganguli, Jascha Sohl-Dickstein
arXiv

Random neural networks show exponential growth in activation patterns, trajectory length, and other measures of expressivity.

Adversarially learned inference
Vincent Dumoulin, Ishmael Belghazi, Ben Poole, Alex Lamb, Martin Arjovsky, Olivier Mastropietro, Aaron Courville
arXiv / project page

Jointly learn a generative model and an inference network through an adversarial process.

Direction Selectivity in Drosophila Emerges from Preferred-Direction Enhancement and Null-Direction Suppression
Jonathan Leong*, Jennifer Esch*, Ben Poole*, Surya Ganguli, Thomas Clandinin
* equal contribution
The Journal of Neuroscience, 2016

Fruit flies detect motion using an algorithm very similar to the one used by humans.

The Fast Bilateral Solver
Jonathan T. Barron, Ben Poole
European Conference on Computer Vision (ECCV), 2016   (oral presentation)
arXiv / supplement / code / bibtex

Fast and accurate edge-aware smoothing. Differentiable for all your deep learning needs.

Fast large-scale optimization by unifying stochastic gradient and quasi-Newton methods
Jascha Sohl-Dickstein, Ben Poole, Surya Ganguli
International Conference on Machine Learning (ICML), 2014
arXiv / code

Speed up quasi-Newton methods by maintaining a low-dimensional approximation of the Hessian for each minibatch.

Analyzing noise in autoencoders and deep networks
Ben Poole, Jascha Sohl-Dickstein, Surya Ganguli
NIPS Workshop on Deep Learning, 2013
arXiv

Derives analytic regularizers for different forms of noise injection, and shows how alternative types of additive noise can improve over dropout.
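In the simplest case, additive Gaussian input noise on a linear reconstruction, the marginalization is exact: E‖W(x+ε) − x‖² = ‖Wx − x‖² + σ²‖W‖²_F, i.e. the noise acts as an L2 (weight-decay-like) penalty. A toy NumPy check of that identity (all variable names hypothetical; this illustrates the general idea, not the paper's specific derivations):

```python
import numpy as np

rng = np.random.default_rng(0)
d, sigma = 5, 0.1
W = rng.normal(size=(d, d)) * 0.3        # linear "autoencoder" weights
x = rng.normal(size=d)                   # a single input

# Monte Carlo estimate of the expected reconstruction loss
# under additive Gaussian input noise eps ~ N(0, sigma^2 I).
eps = rng.normal(scale=sigma, size=(200_000, d))
mc_loss = np.mean(np.sum(((x + eps) @ W.T - x) ** 2, axis=1))

# Analytic form: clean loss plus a sigma^2-scaled L2 penalty on W.
clean = np.sum((W @ x - x) ** 2)
analytic = clean + sigma**2 * np.sum(W**2)
```

Here `mc_loss` and `analytic` agree up to Monte Carlo error, showing how a noise-injection scheme corresponds to an explicit regularizer.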

Brain Regions Engaged by Part- and Whole-task Performance in a Video Game: A Model-based Test of the Decomposition Hypothesis
John Anderson, Daniel Bothell, Jon Fincham, Abraham Anderson, Ben Poole, Yulin Qin
Journal of Cognitive Neuroscience, 2011

Complex tasks, like the Space Fortress video game, can be decomposed into a set of independent reusable components.

Abstracts

Robust non-rigid alignment of volumetric calcium imaging data
Ben Poole, Logan Grosenick, Michael Broxton, Karl Deisseroth, Surya Ganguli
Computational Systems Neuroscience (COSYNE), 2015
poster

Correct for translations and rotations of noisy volumetric data without a clean reference volume.

Connecting scene statistics to probabilistic population codes and tuning properties of V1 neurons
Ben Poole, Ian Lenz, Grace Lindsay, Jason Samonds, Tai Sing Lee
Society for Neuroscience (SFN), 2010 (oral presentation)

