scott.linderman@columbia.edu

Scott W. Linderman

Postdoctoral Fellow, Columbia University

I'm a postdoctoral fellow in the labs of David Blei and Liam Paninski at Columbia University. I completed my Ph.D. in Computer Science at Harvard University, where I was advised by Ryan Adams and Leslie Valiant. My research focuses on machine learning, computational neuroscience, and the general question of how computational and statistical methods can help us decipher neural computation. In June 2019, I'll be joining Stanford University as an assistant professor in the Statistics Department and the Stanford Neurosciences Institute.

Here are a few highlights from my recent work.

Recurrent Switching Linear Dynamical Systems

Bayesian learning and inference for models with co-evolving discrete and continuous latent states.

With Matt Johnson, Andy Miller, Ryan Adams, David Blei, and Liam Paninski. AISTATS, 2017.
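
To give a feel for the model class: in a recurrent SLDS, the discrete state's transition probabilities depend on the preceding continuous state, so switches can be triggered by where the trajectory is in state space. Here is a minimal generative simulation sketch (all parameter values are illustrative, not from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)
K, D, T = 2, 2, 200                      # discrete states, continuous dim, time steps

# Per-state linear dynamics: x_t = A[z_t] x_{t-1} + b[z_t] + noise
theta = 0.1
rot = np.array([[np.cos(theta), -np.sin(theta)],
                [np.sin(theta),  np.cos(theta)]])
A = np.stack([0.99 * np.eye(D), 0.99 * rot])
b = np.stack([np.array([0.05, 0.0]), np.array([-0.05, 0.0])])

# "Recurrent" part: logits of z_t depend linearly on x_{t-1}
R = rng.normal(size=(K, D))
r = np.zeros(K)

def softmax(a):
    e = np.exp(a - a.max())
    return e / e.sum()

x = np.zeros((T, D))
z = np.zeros(T, dtype=int)
for t in range(1, T):
    p = softmax(R @ x[t - 1] + r)        # discrete transition depends on continuous state
    z[t] = rng.choice(K, p=p)
    x[t] = A[z[t]] @ x[t - 1] + b[z[t]] + 0.01 * rng.normal(size=D)
```

In a standard SLDS the transition probabilities `p` would be a fixed row of a transition matrix; making them a function of `x[t - 1]` is what lets the model learn which regions of state space drive switching.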

Variational Sequential Monte Carlo

We view SMC as a variational family indexed by the parameters of its proposal distribution and show how this generalizes the importance weighted autoencoder. As the number of particles goes to infinity, the variational approximation approaches the true posterior.

With Christian Naesseth, Rajesh Ranganath, and David Blei. AISTATS, 2018.
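
The central quantity is the SMC estimate of the marginal likelihood: it is unbiased, so by Jensen's inequality its expected log lower-bounds log p(y), giving a variational objective over proposal parameters. A toy bootstrap particle filter for a 1-D linear Gaussian model shows how that estimate is accumulated (a sketch, not the paper's implementation):

```python
import numpy as np

rng = np.random.default_rng(1)
T, N = 50, 512                           # time steps, particles

# Simulate data from x_t = 0.9 x_{t-1} + v_t,  y_t = x_t + e_t
x_true = np.zeros(T)
for t in range(1, T):
    x_true[t] = 0.9 * x_true[t - 1] + rng.normal()
y = x_true + 0.5 * rng.normal(size=T)

def log_normal(v, mean, std):
    return -0.5 * np.log(2 * np.pi * std**2) - 0.5 * ((v - mean) / std) ** 2

# Bootstrap SMC: propose from the prior dynamics, weight by the likelihood
x = rng.normal(size=N)                   # particles at t = 0
logZ = 0.0
for t in range(T):
    logw = log_normal(y[t], x, 0.5)      # incremental importance weights
    m = logw.max()
    logZ += m + np.log(np.mean(np.exp(logw - m)))   # log of the average weight
    p = np.exp(logw - m)
    p /= p.sum()
    x = x[rng.choice(N, size=N, p=p)]    # multinomial resampling
    x = 0.9 * x + rng.normal(size=N)     # propagate through the dynamics
```

Variational SMC treats the proposal (here, the fixed prior dynamics) as a parameterized family and maximizes E[logZ] over those parameters; with N = 1 particle the bound reduces to the standard ELBO.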

Rejection Sampling Variational Inference

Reparameterization gradients through rejection samplers for automatic variational inference in models with gamma, beta, and Dirichlet latent variables.

Received the Best Paper Award at AISTATS 2017!

With Christian Naesseth, Fran Ruiz, and David Blei.
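
The gamma distribution illustrates the idea: the Marsaglia–Tsang rejection sampler pushes a standard normal ε through a map h(ε, α) that is smooth in α, so once an ε is accepted, dz/dα is available for a pathwise gradient. A minimal sketch for shape α ≥ 1 (the paper's correction term for the rejection step is omitted here):

```python
import numpy as np

def h(eps, alpha):
    """Marsaglia-Tsang transform: differentiable in alpha."""
    d = alpha - 1.0 / 3.0
    c = 1.0 / np.sqrt(9.0 * d)
    return d * (1.0 + c * eps) ** 3

def sample_gamma(alpha, rng):
    """Rejection sampler for Gamma(alpha, 1), alpha >= 1; returns (z, accepted eps)."""
    d = alpha - 1.0 / 3.0
    while True:
        eps = rng.normal()
        z = h(eps, alpha)
        if z <= 0:
            continue
        v = z / d
        # Marsaglia & Tsang (2000) acceptance test
        if np.log(rng.uniform()) < 0.5 * eps**2 + d - d * v + d * np.log(v):
            return z, eps

rng = np.random.default_rng(2)
alpha = 4.0
z, eps = sample_gamma(alpha, rng)

# The accepted eps yields a pathwise derivative dz/dalpha by differentiating h
delta = 1e-5
dz_dalpha = (h(eps, alpha + delta) - h(eps, alpha)) / delta
```

In an autodiff framework the finite difference would be replaced by differentiating `h` directly, which is what makes these gradients usable for stochastic variational inference.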

Studying Synaptic Plasticity with Time-Varying GLMs

We propose a time-varying generalized linear model whose weights evolve according to synaptic plasticity rules, and we perform Bayesian inference with particle MCMC.

With Chris Stock and Ryan Adams. NIPS, 2014.
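
Concretely, the spike rate follows a GLM, but the synaptic weight is itself a latent trajectory that drifts under a plasticity rule. A toy simulation with one synapse and a simple Hebbian-style rule (the rule and all constants here are illustrative, not the paper's):

```python
import numpy as np

rng = np.random.default_rng(3)
T, dt = 1000, 0.001                      # time bins, bin width in seconds

pre = rng.random(T) < 20 * dt            # presynaptic spikes at ~20 Hz
post = np.zeros(T, dtype=bool)           # postsynaptic spikes
w = np.zeros(T)                          # time-varying synaptic weight
w[0] = 0.5
base = np.log(10.0)                      # baseline log-rate (~10 Hz)

for t in range(1, T):
    # GLM: log-linear rate driven by the presynaptic spike through the current weight
    rate = np.exp(base + w[t - 1] * pre[t])
    post[t] = rng.random() < rate * dt   # Poisson spiking per bin
    # Hebbian-style drift: potentiate on coincident spikes, slowly decay otherwise
    dw = 0.05 * (pre[t] & post[t]) - 0.001 * w[t - 1]
    w[t] = w[t - 1] + dw
```

Inference then amounts to recovering the latent weight trajectory `w` (and the rule's parameters) from the observed spikes, which is where the particle MCMC machinery comes in.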

Publications

2018

  1. Sharma, A., Johnson, R. E., Engert, F., & Linderman, S. W. (2018). Point process latent variable models of freely swimming larval zebrafish. Advances in Neural Information Processing Systems (NIPS).
  2. Markowitz, J. E., Gillis, W. F., Beron, C. C., Neufeld, S. Q., Robertson, K., Bhagat, N. D., … Datta, S. R. (2018). The Striatum Organizes 3D Behavior via Moment-to-Moment Action Selection. Cell. https://doi.org/10.1016/j.cell.2018.04.019
    Paper
  3. Linderman, S. W., Nichols, A., Blei, D. M., Zimmer, M., & Paninski, L. (2018). Hierarchical recurrent models reveal latent states of neural activity in C. elegans. Computational and Systems Neuroscience (Cosyne) Abstracts.
  4. Markowitz, J. E., Gillis, W. F., Beron, C. C., Neufeld, S. Q., Robertson, K., Bhagat, N. D., … Datta, S. R. (2018). Complementary Direct and Indirect Pathway Activity Encodes Sub-Second 3D Pose Dynamics in Striatum. Computational and Systems Neuroscience (Cosyne) Abstracts.
  5. Johnson*, R. E., Linderman*, S. W., Panier, T., Wee, C., Song, E., Miller, A. C., & Engert, F. (2018). Revealing multiple timescales of structure in larval zebrafish behavior. Computational and Systems Neuroscience (Cosyne) Abstracts.
  6. Mena, G. E., Belanger, D., Linderman, S. W., & Snoek, J. (2018). Learning Latent Permutations with Gumbel-Sinkhorn Networks. International Conference on Learning Representations.
    Paper Code
  7. Linderman, S. W., Mena, G. E., Cooper, H., Paninski, L., & Cunningham, J. P. (2018). Reparameterizing the Birkhoff Polytope for Variational Permutation Inference. In Proceedings of the 21st International Conference on Artificial Intelligence and Statistics (AISTATS).
    arXiv
  8. Naesseth, C. A., Linderman, S. W., Ranganath, R., & Blei, D. M. (2018). Variational Sequential Monte Carlo. In Proceedings of the 21st International Conference on Artificial Intelligence and Statistics (AISTATS).
    arXiv Code

2017

  1. Linderman, S. W., Wang, Y., & Blei, D. M. (2017). Bayesian inference for latent Hawkes processes. Advances in Approximate Bayesian Inference Workshop at the 31st Conference on Neural Information Processing Systems.
    Paper
  2. Buchanan, E. K., Lipschitz, A., Linderman, S. W., & Paninski, L. (2017). Quantifying the behavioral dynamics of C. elegans with autoregressive hidden Markov models. Workshop on Worm’s Neural Information Processing at the 31st Conference on Neural Information Processing Systems.
    Paper
  3. Mena, G. E., Linderman, S. W., Belanger, D., Snoek, J., Cunningham, J. P., & Paninski, L. (2017). Toward Bayesian permutation inference for identifying neurons in C. elegans. Workshop on Worm’s Neural Information Processing at the 31st Conference on Neural Information Processing Systems.
    Paper
  4. Linderman, S. W., & Johnson, M. J. (2017). Structure-Exploiting Variational Inference for Recurrent Switching Linear Dynamical Systems. IEEE International Workshop on Computational Advances in Multi-Sensor Adaptive Processing.
    Paper
  5. Linderman, S. W., & Blei, D. M. (2017). Comment: A Discussion of “Nonparametric Bayes Modeling of Populations of Networks.” Journal of the American Statistical Association, 112(520), 1543–1547.
    Paper Code
  6. Linderman, S. W., & Gershman, S. J. (2017). Using computational theory to constrain statistical models of neural data. Current Opinion in Neurobiology, 46, 14–24.
    Paper bioRxiv Code
  7. Linderman, S. W., Miller, A. C., Adams, R. P., Blei, D. M., Johnson, M. J., & Paninski, L. (2017). Neuro-behavioral Analysis with Recurrent Switching Linear Dynamical Systems. Computational and Systems Neuroscience (Cosyne) Abstracts.
  8. Linderman*, S. W., Johnson*, M. J., Miller, A. C., Adams, R. P., Blei, D. M., & Paninski, L. (2017). Bayesian learning and inference in recurrent switching linear dynamical systems. In Proceedings of the 20th International Conference on Artificial Intelligence and Statistics (AISTATS).
    Paper Slides Talk Code
  9. Naesseth, C. A., Ruiz, F. J. R., Linderman, S. W., & Blei, D. M. (2017). Reparameterization Gradients through Acceptance-Rejection Sampling Algorithms. In Proceedings of the 20th International Conference on Artificial Intelligence and Statistics (AISTATS). Best Paper Award.
    Paper Blog Post Code

2016

  1. Naesseth, C. A., Ruiz, F. J. R., Linderman, S. W., & Blei, D. M. (2016). Rejection sampling variational inference. Advances in Approximate Bayesian Inference Workshop at the 30th Conference on Neural Information Processing Systems.
    Paper
  2. Chen, Z., Linderman, S. W., & Wilson, M. A. (2016). Bayesian Nonparametric Methods For Discovering Latent Structures Of Rat Hippocampal Ensemble Spikes. IEEE Workshop on Machine Learning for Signal Processing.
    Paper Code
  3. Elibol, H. M., Nguyen, V., Linderman, S. W., Johnson, M. J., Hashmi, A., & Doshi-Velez, F. (2016). Cross-Corpora Unsupervised Learning of Trajectories in Autism Spectrum Disorders. Journal of Machine Learning Research, 17(133), 1–38.
    Paper
  4. Linderman, S. W., Adams, R. P., & Pillow, J. W. (2016). Bayesian latent structure discovery from multi-neuron recordings. In Advances in Neural Information Processing Systems (NIPS).
    Paper Code
  5. Linderman, S. W. (2016). Bayesian methods for discovering structure in neural spike trains (PhD thesis). Harvard University. Leonard J. Savage Award for Outstanding Dissertation in Applied Bayesian Methodology from the International Society for Bayesian Analysis.
    Thesis Code
  6. Linderman, S. W., Johnson, M. J., Wilson, M. A., & Chen, Z. (2016). A Bayesian nonparametric approach to uncovering rat hippocampal population codes during spatial navigation. Journal of Neuroscience Methods, 263, 36–47.
    Paper Code
  7. Linderman, S. W., Tucker, A., & Johnson, M. J. (2016). Bayesian Latent State Space Models of Neural Activity. Computational and Systems Neuroscience (Cosyne) Abstracts.

2015

  1. Linderman*, S. W., Johnson*, M. J., & Adams, R. P. (2015). Dependent Multinomial Models Made Easy: Stick-Breaking with the Pólya-gamma Augmentation. In Advances in Neural Information Processing Systems (NIPS) (pp. 3438–3446).
    Paper arXiv Code
  2. Linderman, S. W., & Adams, R. P. (2015). Scalable Bayesian Inference for Excitatory Point Process Networks. arXiv preprint arXiv:1507.03228.
    arXiv Code
  3. Linderman, S. W., Adams, R. P., & Pillow, J. W. (2015). Inferring structured connectivity from spike trains under negative-binomial generalized linear models. Computational and Systems Neuroscience (Cosyne) Abstracts.
  4. Johnson, M. J., Linderman, S. W., Datta, S. R., & Adams, R. P. (2015). Discovering switching autoregressive dynamics in neural spike train recordings. Computational and Systems Neuroscience (Cosyne) Abstracts.

2014

  1. Linderman, S. W., Stock, C. H., & Adams, R. P. (2014). A framework for studying synaptic plasticity with neural spike train data. In Advances in Neural Information Processing Systems (NIPS) (pp. 2330–2338).
    Paper arXiv
  2. Linderman, S. W. (2014). Discovering Latent States of the Hippocampus with Bayesian Hidden Markov Models. CBMM Memo 024: Abstracts of the Brains, Minds, and Machines Summer School.
    Paper
  3. Linderman, S. W., & Adams, R. P. (2014). Discovering Latent Network Structure in Point Process Data. In Proceedings of the International Conference on Machine Learning (ICML) (pp. 1413–1421).
    Paper arXiv Talk Code
  4. Linderman, S. W., Stock, C. H., & Adams, R. P. (2014). A framework for studying synaptic plasticity with neural spike train data. Annual Meeting of the Society for Neuroscience.
    Paper
  5. Nemati, S., Linderman, S. W., & Chen, Z. (2014). A Probabilistic Modeling Approach for Uncovering Neural Population Rotational Dynamics. Computational and Systems Neuroscience (Cosyne) Abstracts.

2013

  1. Linderman, S. W., & Adams, R. P. (2013). Fully-Bayesian Inference of Structured Functional Networks in GLMs. Acquiring and Analyzing the Activity of Large Neural Ensembles Workshop at Neural Information Processing Systems (NIPS).
  2. Linderman, S. W., & Adams, R. P. (2013). Discovering structure in spiking networks. New England Machine Learning Day.
  3. Linderman, S. W., & Adams, R. P. (2013). Inferring functional connectivity with priors on network topology. Computational and Systems Neuroscience (Cosyne) Abstracts.