Scott W. Linderman

Postdoctoral Fellow, Columbia University

scott.linderman@columbia.edu

I'm a postdoctoral fellow in the labs of David Blei and Liam Paninski at Columbia University. I completed my Ph.D. in Computer Science at Harvard University under the supervision of Ryan Adams and Leslie Valiant. My research focuses on machine learning, computational neuroscience, and, more broadly, on how computer science and statistics can help us decipher neural computation. I've worked on bottom-up methods for discovering interpretable structure in large-scale neural recordings as well as on top-down models of biological computation.

Here are a few highlights from my recent work.

Recurrent Switching Linear Dynamical Systems

Bayesian learning and inference for models with co-evolving discrete and continuous latent states.

With Matt Johnson, Andy Miller, Ryan Adams, David Blei, and Liam Paninski. AISTATS, 2017.
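The key idea is that the discrete state's transition probabilities depend on the previous continuous state. Here is a toy generative sketch of that recurrence; the specific parameter values (dynamics matrices, recurrence weights, noise scales) are illustrative, not from the paper:

```python
import numpy as np

def softmax(v):
    v = v - v.max()
    e = np.exp(v)
    return e / e.sum()

def simulate_rslds(T=100, K=2, D=2, seed=0):
    """Sample a toy recurrent SLDS: the discrete state z_t is drawn from a
    softmax over the previous continuous state x_{t-1}, then x_t follows the
    linear dynamics of the chosen state."""
    rng = np.random.default_rng(seed)
    A = np.stack([0.95 * np.eye(D) for _ in range(K)])  # per-state dynamics
    b = rng.normal(0.0, 0.1, size=(K, D))               # per-state offsets
    R = rng.normal(0.0, 1.0, size=(K, D))               # recurrence weights
    z = np.zeros(T, dtype=int)
    x = np.zeros((T, D))
    x[0] = rng.normal(size=D)
    for t in range(1, T):
        # the "recurrent" part: discrete transitions depend on x_{t-1}
        p = softmax(R @ x[t - 1])
        z[t] = rng.choice(K, p=p)
        x[t] = A[z[t]] @ x[t - 1] + b[z[t]] + 0.1 * rng.normal(size=D)
    return z, x
```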

Rejection Sampling Variational Inference

Reparameterization gradients through rejection samplers for automatic variational inference in models with gamma, beta, and Dirichlet latent variables.

Received the Best Paper Award at AISTATS 2017!

With Christian Naesseth, Fran Ruiz, and David Blei. AISTATS, 2017.
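The trick is that many rejection samplers first push a simple random variable through a smooth, parameter-dependent transform, so gradients can flow through that transform. For the gamma case, the proposal in Marsaglia and Tsang's rejection sampler maps a standard normal draw to an (approximate, pre-acceptance) Gamma(alpha, 1) sample:

```python
import numpy as np

def h(eps, alpha):
    """Marsaglia-Tsang transform: maps standard normal draws eps to
    Gamma(alpha, 1) proposals. It is differentiable in alpha, which is
    what lets reparameterization gradients flow through the sampler."""
    return (alpha - 1.0 / 3.0) * (1.0 + eps / np.sqrt(9.0 * alpha - 3.0)) ** 3
```

Even without the accept/reject correction, E[h(eps, alpha)] = alpha exactly, so Monte Carlo estimates built on this transform behave sensibly.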

Variational Sequential Monte Carlo

We view SMC as a variational family indexed by the parameters of its proposal distribution and show how this generalizes the importance weighted autoencoder. As the number of particles goes to infinity, the variational approximation approaches the true posterior.

With Christian Naesseth, Rajesh Ranganath, and David Blei. arXiv, 2017.
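The quantity being optimized is the SMC estimate of the log marginal likelihood. A minimal bootstrap particle filter makes this concrete; the linear-Gaussian model and its parameters (a, q, r) here are illustrative, not from the paper:

```python
import numpy as np

def bootstrap_pf_logZ(ys, num_particles=1000, a=0.9, q=1.0, r=1.0, seed=0):
    """Bootstrap particle filter for x_t = a*x_{t-1} + N(0, q),
    y_t = x_t + N(0, r). Returns the SMC log marginal likelihood
    estimate, whose expectation lower-bounds log p(y_{1:T})."""
    rng = np.random.default_rng(seed)
    N = num_particles
    x = rng.normal(0.0, 1.0, size=N)  # initial particles
    logZ = 0.0
    for y in ys:
        # propagate through the transition (the bootstrap "proposal")
        x = a * x + rng.normal(0.0, np.sqrt(q), size=N)
        # weight by the observation likelihood (log-sum-exp for stability)
        logw = -0.5 * np.log(2 * np.pi * r) - 0.5 * (y - x) ** 2 / r
        m = logw.max()
        w = np.exp(logw - m)
        logZ += m + np.log(w.mean())
        # multinomial resampling
        x = x[rng.choice(N, size=N, p=w / w.sum())]
    return logZ
```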

Dependent Multinomial Models Made Easy

We use a stick-breaking construction and Pólya-gamma augmentation to derive block Gibbs samplers for linear Gaussian models with multinomial observations.

With Matt Johnson and Ryan Adams. NIPS, 2015.

Studying Synaptic Plasticity with Time-Varying GLMs

We propose a time-varying generalized linear model whose weights evolve according to synaptic plasticity rules, and we perform Bayesian inference with particle MCMC.

With Chris Stock and Ryan Adams. NIPS, 2014.
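To give a feel for the model class, here is a toy pre/post neuron pair whose GLM weight drifts under a simple Hebbian rule; the rule and all rates here are illustrative stand-ins for the plasticity rules considered in the paper:

```python
import numpy as np

def simulate_plasticity_glm(T=500, w0=0.5, eta=0.02, decay=0.01,
                            base_rate=0.1, pre_rate=0.2, seed=0):
    """Simulate a time-varying GLM: the postsynaptic spike probability is a
    logistic function of the presynaptic spike scaled by the current weight,
    and the weight potentiates on coincident spikes and decays otherwise."""
    rng = np.random.default_rng(seed)
    pre = rng.random(T) < pre_rate          # presynaptic spike train
    post = np.zeros(T, dtype=bool)
    w = np.zeros(T)
    w[0] = w0
    for t in range(1, T):
        # GLM observation model for the postsynaptic neuron
        p = 1.0 / (1.0 + np.exp(-(np.log(base_rate) + w[t - 1] * pre[t])))
        post[t] = rng.random() < p
        # Hebbian drift: potentiate on pre/post coincidence, then decay
        w[t] = w[t - 1] + eta * (pre[t] and post[t]) - decay * w[t - 1]
    return w, pre, post
```

Inference in the paper runs the other way: given only the spike trains, particle MCMC recovers the posterior over the latent weight trajectory and the plasticity-rule parameters.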

Publications

2017

  1. Naesseth, C. A., Linderman, S. W., Ranganath, R., & Blei, D. M. (2017). Variational Sequential Monte Carlo. arXiv preprint arXiv:1705.11140.
    arXiv Code
  2. Linderman, S. W., & Gershman, S. J. (2017). Using computational theory to constrain statistical models of neural data. Current Opinion in Neurobiology, 46, 14–24.
    Paper bioRxiv Code
  3. Linderman*, S. W., Johnson*, M. J., Miller, A. C., Adams, R. P., Blei, D. M., & Paninski, L. (2017). Recurrent switching linear dynamical systems. In Proceedings of the 20th International Conference on Artificial Intelligence and Statistics (AISTATS).
    Paper Slides Talk Code
  4. Naesseth, C. A., Ruiz, F. J. R., Linderman, S. W., & Blei, D. M. (2017). Reparameterization Gradients through Acceptance-Rejection Sampling Algorithms. In Proceedings of the 20th International Conference on Artificial Intelligence and Statistics (AISTATS). Best Paper Award.
    Paper Blog Post Code

2016

  1. Naesseth, C. A., Ruiz, F. J. R., Linderman, S. W., & Blei, D. M. (2016). Rejection sampling variational inference. Advances in Approximate Bayesian Inference NIPS Workshop.
  2. Chen, Z., Linderman, S. W., & Wilson, M. A. (2016). Bayesian Nonparametric Methods For Discovering Latent Structures Of Rat Hippocampal Ensemble Spikes. IEEE Workshop on Machine Learning for Signal Processing.
    Paper Code
  3. Elibol, H. M., Nguyen, V., Linderman, S. W., Johnson, M. J., Hashmi, A., & Doshi-Velez, F. (2016). Cross-Corpora Unsupervised Learning of Trajectories in Autism Spectrum Disorders. Journal of Machine Learning Research, 17(133), 1–38.
    Paper
  4. Linderman, S. W., Adams, R. P., & Pillow, J. W. (2016). Bayesian latent structure discovery from multi-neuron recordings. In Advances in Neural Information Processing Systems (NIPS).
    Paper Code
  5. Linderman, S. W. (2016). Bayesian methods for discovering structure in neural spike trains (PhD thesis). Harvard University. Received the Leonard J. Savage Award for Outstanding Dissertation in Applied Bayesian Methodology from the International Society for Bayesian Analysis.
    Thesis Code
  6. Linderman, S. W., Johnson, M. J., Wilson, M. A., & Chen, Z. (2016). A Bayesian nonparametric approach to uncovering rat hippocampal population codes during spatial navigation. Journal of Neuroscience Methods, 263, 36–47.
    Paper Code
  7. Linderman, S. W., Tucker, A., & Johnson, M. J. (2016). Bayesian Latent State Space Models of Neural Activity. Computational and Systems Neuroscience (Cosyne) Abstracts.

2015

  1. Linderman*, S. W., Johnson*, M. J., & Adams, R. P. (2015). Dependent Multinomial Models Made Easy: Stick-Breaking with the Pólya-gamma Augmentation. In Advances in Neural Information Processing Systems (NIPS) (pp. 3438–3446).
    Paper arXiv Code
  2. Linderman, S. W., & Adams, R. P. (2015). Scalable Bayesian Inference for Excitatory Point Process Networks. arXiv preprint arXiv:1507.03228.
    arXiv Code
  3. Linderman, S. W., Adams, R. P., & Pillow, J. W. (2015). Inferring structured connectivity from spike trains under negative-binomial generalized linear models. Computational and Systems Neuroscience (Cosyne) Abstracts.
  4. Johnson, M. J., Linderman, S. W., Datta, S. R., & Adams, R. P. (2015). Discovering switching autoregressive dynamics in neural spike train recordings. Computational and Systems Neuroscience (Cosyne) Abstracts.

2014

  1. Linderman, S. W., Stock, C. H., & Adams, R. P. (2014). A framework for studying synaptic plasticity with neural spike train data. In Advances in Neural Information Processing Systems (NIPS) (pp. 2330–2338).
    Paper arXiv
  2. Linderman, S. W. (2014). Discovering Latent States of the Hippocampus with Bayesian Hidden Markov Models. CBMM Memo 024: Abstracts of the Brains, Minds, and Machines Summer School.
    Paper
  3. Linderman, S. W., & Adams, R. P. (2014). Discovering Latent Network Structure in Point Process Data. In Proceedings of the International Conference on Machine Learning (ICML) (pp. 1413–1421).
    Paper arXiv Talk Code
  4. Linderman, S. W., Stock, C. H., & Adams, R. P. (2014). A framework for studying synaptic plasticity with neural spike train data. Annual Meeting of the Society for Neuroscience.
    Paper
  5. Nemati, S., Linderman, S. W., & Chen, Z. (2014). A Probabilistic Modeling Approach for Uncovering Neural Population Rotational Dynamics. Computational and Systems Neuroscience (Cosyne) Abstracts.

2013

  1. Linderman, S. W., & Adams, R. P. (2013). Fully-Bayesian Inference of Structured Functional Networks in GLMs. Acquiring and Analyzing the Activity of Large Neural Ensembles Workshop at Neural Information Processing Systems (NIPS).
  2. Linderman, S. W., & Adams, R. P. (2013). Discovering structure in spiking networks. New England Machine Learning Day.
  3. Linderman, S. W., & Adams, R. P. (2013). Inferring functional connectivity with priors on network topology. Computational and Systems Neuroscience (Cosyne) Abstracts.