[5] A. Pakman*, J. Huggins*, & L. Paninski (Under review). Fast penalized state-space methods for inferring dendritic synaptic connectivity. (*contributed equally)

[4] L. Paninski, K. Rahnama Rad, & J. Huggins (Under review). Fast low-SNR Kalman filtering, with applications to high-dimensional smoothing. [pdf]

[3] J. Huggins & L. Paninski (2011). Optimal experimental design for sampling voltage on dendritic trees in the low-SNR regime. Journal of Computational Neuroscience (in press). [pdf]

[2] M. Vilain, J. Huggins, & B. Wellner (2009). Sources of performance in CRF transfer training: a business name-tagging case study. Proc. of Recent Advances in Natural Language Processing 2009. [pdf]

[1] M. Vilain, J. Huggins, & B. Wellner (2009). A simple feature-copying approach to long-distance dependencies. Proc. of the 13th Conference on Computational Natural Language Learning 2009. [pdf]

Expository Papers, etc.

In Fall 2011, I wrote a final paper called “Provably Learning Mixtures of Gaussians and More” for Rocco Servedio’s class on computational learning theory.

Also in Fall 2011, for a final project in Stephen Edwards' compilers class, David Hu, Hans Hyttinen, Harley McGrew, and I created YAPPL (Yet Another Probabilistic Programming Language). The final report includes a short tutorial and the language reference manual. The compiler is written in OCaml, which happens to be one of my favorite programming languages.

© 2011 Jonathan Huggins | Generated by webgen | Design by Andreas Viklund.