Bayesian learning and inference for models with co-evolving discrete and continuous latent states.
With Matt Johnson, Andy Miller, Ryan Adams, David Blei, and Liam Paninski. AISTATS, 2017.
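To give a feel for the model class, here is a toy generative sketch of a switching linear dynamical system in which the continuous state feeds back into the discrete transition probabilities, so the two co-evolve. The parameterization and names are illustrative, not the paper's code.

```python
# Toy sketch of co-evolving discrete and continuous latent states: z_t selects
# the linear dynamics of x_t, and x_t in turn shifts the logits of the next
# discrete transition. All constants here are illustrative.
import numpy as np

rng = np.random.default_rng(0)
T, K, D = 200, 2, 1
A = np.array([[[0.95]], [[0.70]]])   # per-state dynamics matrices (K, D, D)
b = np.array([[0.0], [0.5]])         # per-state offsets (K, D)
R = np.array([[1.0], [-1.0]])        # recurrent weights: x -> transition logits

z = np.zeros(T, dtype=int)
x = np.zeros((T, D))
for t in range(1, T):
    logits = R @ x[t - 1]            # continuous state biases the switch
    p = np.exp(logits - logits.max()); p /= p.sum()
    z[t] = rng.choice(K, p=p)
    x[t] = A[z[t]] @ x[t - 1] + b[z[t]] + 0.1 * rng.standard_normal(D)
```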
Reparameterization gradients through rejection samplers for automatic variational inference in models with gamma, beta, and Dirichlet latent variables.
Received the Best Paper Award at AISTATS 2017!
With Christian Naesseth, Fran Ruiz, and David Blei.
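For intuition, here is a minimal sketch of the pathwise gradient through the Marsaglia-Tsang proposal that underlies gamma rejection sampling. It drops the correction term the paper derives for the accept/reject step, so the estimator is slightly biased; the function names are mine, not the paper's.

```python
# Pathwise (reparameterization) gradient through the Marsaglia-Tsang gamma
# proposal: eps ~ N(0, 1) is pushed through h to give an approximate
# Gamma(alpha, 1) sample. The accept/reject correction from the paper is
# omitted here, so this sketch is slightly biased.
import numpy as np

def h(eps, alpha):
    """Marsaglia-Tsang transform for the Gamma(alpha, 1) proposal."""
    return (alpha - 1.0 / 3) * (1 + eps / np.sqrt(9 * alpha - 3)) ** 3

def dh_dalpha(eps, alpha):
    """Derivative of the transform with respect to the shape alpha."""
    u = 1 + eps / np.sqrt(9 * alpha - 3)
    du = -4.5 * eps * (9 * alpha - 3) ** -1.5
    return u ** 3 + 3 * (alpha - 1.0 / 3) * u ** 2 * du

# Monte Carlo estimate of d/dalpha E[f(z)] for f(z) = z**2, z ~ Gamma(alpha, 1).
# Exact answer: d/dalpha (alpha + alpha**2) = 1 + 2 * alpha = 5 at alpha = 2.
alpha, n = 2.0, 200_000
eps = np.random.randn(n)
z = h(eps, alpha)
print(np.mean(2 * z * dh_dalpha(eps, alpha)))  # close to 5
```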
Top-down and bottom-up methods are joined in a theory-driven analysis pipeline: we view theories as priors for statistical models, perform Bayesian inference, criticize the resulting fits, and revise our theories accordingly.
With Sam Gershman. Current Opinion in Neurobiology, 2017.
We view sequential Monte Carlo (SMC) as a variational family indexed by the parameters of its proposal distribution, and we show that this generalizes the importance weighted autoencoder. As the number of particles goes to infinity, the variational approximation approaches the true posterior.
With Christian Naesseth, Rajesh Ranganath, and David Blei. arXiv, 2017.
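The quantity being optimized is the expected log of the unbiased SMC estimate of the marginal likelihood, which lower-bounds the log evidence by Jensen's inequality; with a single time step it reduces to the IWAE objective. In my own notation (a sketch, not the paper's exact statement):

```latex
% Variational SMC objective (notation assumed): maximize the expected
% log normalizing-constant estimate over proposal parameters \lambda.
\mathcal{L}_{\mathrm{SMC}}(\lambda)
  = \mathbb{E}\!\left[\log \hat{Z}_N\right] \le \log p(y_{1:T}),
\qquad
\hat{Z}_N = \prod_{t=1}^{T} \frac{1}{N} \sum_{i=1}^{N} w_t^{i},
```

where the \(w_t^i\) are the importance weights of the \(N\) particles under the proposal \(r_\lambda\).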
My dissertation work at Harvard University on networks, point processes, and state space models for neural data analysis.
Received the 2016 Leonard J. Savage Award for Outstanding Dissertation in Applied Bayesian Methodology!
We combine network priors, nonlinear autoregressive models, and Pólya-gamma augmentation to reveal latent types and features of neurons using spike trains alone.
With Ryan Adams and Jonathan Pillow. NIPS, 2016.
We use a stick-breaking construction and Pólya-gamma augmentation to derive block Gibbs samplers for linear Gaussian models with multinomial observations.
With Matt Johnson and Ryan Adams. NIPS, 2015.
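The heart of the construction is the map below from K−1 real-valued "stick" variables to a point on the simplex; writing the multinomial this way turns it into a product of binomials, each of which admits Pólya-gamma augmentation. A minimal sketch with illustrative names:

```python
# Stick-breaking map from K-1 real stick variables to a K-dim probability
# vector: each psi_k claims a logistic fraction of the remaining stick.
import numpy as np

def sigmoid(x):
    return 1.0 / (1 + np.exp(-x))

def stick_breaking(psi):
    """Map psi in R^{K-1} to pi on the (K-1)-simplex."""
    K = len(psi) + 1
    pi = np.zeros(K)
    remaining = 1.0
    for k in range(K - 1):
        pi[k] = sigmoid(psi[k]) * remaining  # fraction of the remaining stick
        remaining -= pi[k]
    pi[-1] = remaining                       # last category gets the leftover
    return pi

psi = np.array([0.5, -1.0, 2.0])
print(stick_breaking(psi), stick_breaking(psi).sum())  # sums to 1
```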
We propose a time-varying generalized linear model whose weights evolve according to synaptic plasticity rules, and we perform Bayesian inference with particle MCMC.
With Chris Stock and Ryan Adams. NIPS, 2014.
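Roughly, the generative model is a spiking GLM whose synaptic weight is itself a latent time series driven by a plasticity rule, so inference reduces to a nonlinear state-space problem that particle MCMC handles naturally. A toy sketch with an STDP-like additive rule (the rule form and constants are illustrative, not the paper's):

```python
# Toy sketch: a Bernoulli GLM whose weight w_t drifts according to an additive
# STDP-like rule driven by pre/post spike pairings. Illustrative only; the
# paper considers richer rules and performs Bayesian inference over w_{1:T}
# and the rule parameters with particle MCMC.
import numpy as np

rng = np.random.default_rng(0)
T, A_plus, A_minus, bias = 1000, 0.02, 0.02, -2.0
pre = rng.random(T) < 0.05            # presynaptic spikes (observed)
post = np.zeros(T, dtype=bool)        # postsynaptic spikes (observed)
w = np.zeros(T)                       # latent synaptic weight trajectory

for t in range(1, T):
    # GLM emission: postsynaptic spike probability given the current weight
    p = 1.0 / (1.0 + np.exp(-(bias + w[t - 1] * pre[t - 1])))
    post[t] = rng.random() < p
    # potentiate on pre-then-post pairings, depress on post-then-pre
    dw = A_plus * (pre[t - 1] & post[t]) - A_minus * (post[t - 1] & pre[t])
    w[t] = w[t - 1] + dw + 0.01 * rng.standard_normal()
```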
Combining Hawkes processes with generative network models to uncover latent patterns of influence.
With Ryan Adams. ICML, 2014.
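Concretely, the model gates the mutually-exciting terms of a Hawkes process with a latent adjacency matrix drawn from a generative network model. In my own notation (a sketch, not the paper's exact formulation), node k's intensity is

```latex
% Network-modulated Hawkes intensity (notation assumed): a baseline rate plus
% excitation from past events (t_i, c_i), gated by a latent adjacency matrix A
% and weights W drawn from a generative network model.
\lambda_k(t) = \mu_k + \sum_{t_i < t} A_{c_i, k}\, W_{c_i, k}\, g(t - t_i),
\qquad (A, W) \sim p(A, W \mid \text{network model}),
```

so posterior inference over (A, W) from event data alone reveals which nodes influence which.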