I am a computational neuroscientist interested in developing experimentally motivated, mathematically grounded theories of neural systems. I am currently a PhD student in the Center for Theoretical Neuroscience at Columbia University.

This is my old academic website. My current website can be found at jacobfulano.github.io

About

Before my PhD, I was a master's student in Philosophical Foundations of Physics at Columbia University, and I worked as a research assistant in the neuroscience/biomedical engineering lab of Professor Elizabeth Hillman.

I was lucky enough to have studied physics (B.S.) and philosophy (B.A.) at Stanford University (CV, LinkedIn).

I have included some snippets of my work in neuroscience and quantum computing below, including my master's thesis. Feel free to contact me at jpp2139 [at] columbia.edu

Neuroscience

My earlier work in the Laboratory for Optical Imaging (LFOI) of Professor Elizabeth Hillman involved modeling hemodynamic (blood flow) responses to stimuli in rat brains. We worked on a mathematical model that has implications for some interpretations of fMRI data.

Here is the abstract of the poster that I presented at the Society for Neuroscience (SfN) conference in 2014:

Functional magnetic resonance imaging (fMRI) relies upon measuring decreases in the concentration of deoxy-hemoglobin caused by increases in local blood flow. The process that links local neuronal activity to changes in local blood flow in the brain is termed neurovascular coupling. A number of different cellular mechanisms have been proposed to play a role in functional neurovascular coupling, and a better understanding of these mechanisms should permit improved interpretation of the fMRI blood oxygen level dependent (BOLD) signal. However, until now, most mathematical models of the fMRI BOLD response have been based on simplifying assumptions, and have not been directly linked to specific cellular mechanisms. In their simplest form, existing fMRI models predict a linear relationship between neuronal activity and the resultant hemodynamic response; however, numerous studies have demonstrated conditions in which hemodynamic responses are non-linear.
In this work, (1) we propose a new model of functional hyperemia in the brain based on a recently proposed cellular mechanism for neurovascular coupling that includes endothelial propagation of vasodilation. (2) We demonstrate that the properties of this newly recognized component of neurovascular coupling can predict both the spatial and temporal evolution of the hemodynamic response to somatosensory stimuli for a range of different stimulus durations, accurately predicting the response’s non-linear properties. We cross-validate our model using high spatiotemporal resolution optical imaging data acquired on the exposed rat cortex. Our results provide a new basis for the observed form and nonlinearities of functional hyperemia in the brain, while also providing strong evidence for the importance of endothelial propagation of vasodilation in neurovascular coupling.
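
For readers unfamiliar with the "linear" models the abstract refers to: in their standard textbook form (a general illustration, not the model from the poster), they treat the measured hemodynamic response y(t) as the convolution of the neural activity or stimulus time course s(t) with a fixed hemodynamic response function h(t),

y(t) = (h * s)(t) = \int_0^t h(t - \tau)\, s(\tau)\, d\tau .

Because convolution is linear, scaling or lengthening the stimulus simply scales and sums the predicted response; the non-linearities mentioned in the abstract are precisely the observed departures from this additivity.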

Quantum Computing

Here is a link to a working final draft of my M.A. thesis, Decoherence, Superconducting Qubits, and the Possibility of Quantum Computing. My advisor was Professor Allan Blaer in the Columbia Physics Department.

The abstract:

Is it possible to implement a fully controllable, unambiguously quantum computer? While most in the field believe that the answer is in the affirmative, uncertainty and skepticism still exist among academics and industry professionals. In particular, decoherence is often spoken of as an insurmountable challenge. This thesis argues that there are no fundamental mathematical or physical properties that would preclude the possibility of implementing a fully controllable quantum computer using superconducting qubits. The proof is in key results from the past 30 years in math, physics, and computer science; this thesis is a sketch of these results. It begins with the well-known theoretical results that have motivated the field - namely quantum algorithmic speed-up and efficient error correction - and continues with an overview of the well-developed theory of decoherence, arguing that decoherence has been and can still be significantly reduced. These theoretical results are related to superconducting qubits throughout. The thesis concludes with a summary of recent experimental progress with superconducting qubit circuits.
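
As a rough, textbook-level illustration of what decoherence means for a single qubit (a simple pure-dephasing sketch, not a result taken from the thesis): the off-diagonal elements of the qubit's density matrix decay with a characteristic dephasing time T_2,

\rho(t) = \begin{pmatrix} \rho_{00} & \rho_{01}\, e^{-t/T_2} \\ \rho_{10}\, e^{-t/T_2} & \rho_{11} \end{pmatrix},

so a quantum computation must either finish well within T_2 or rely on error correction to extend the effective coherence time; the thesis argues that neither requirement is blocked by any fundamental mathematical or physical principle.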

Here is a link to the first page of a short paper on quantum chemistry algorithms for quantum computing that I wrote for Professor Anargyros Papageorgiou in the Computer Science Department: Ab Initio Quantum Chemistry Methods, Computational Complexity, and Quantum Computation.

While much of the original motivation for quantum computation came from computationally difficult problems in chemistry and physics, most of the theoretical work of the past 30 years has focused on specialized algorithms in discrete algebra (for example, quantum factoring and quantum search algorithms). More recently, however, there has been a growing interest in quantum numerical methods for the simulation of physical systems such as multi-electron atoms and molecules. Many of these types of problems classically require exponential execution time relative to the number of interacting particles; quantum mechanically, however, it has been shown that some of these problems can be solved in polynomial time. Although these results are somewhat encouraging, they are certainly not panaceas for the many issues that plague computational quantum chemistry. This paper attempts to briefly outline a smattering of both classical and quantum-computational algorithms for chemical simulation with a focus on theoretical motivation, convergence, and cost.
We first introduce the traditional Self-Consistent Field (SCF), Configuration Interaction (CI), and Multi-Configuration Self-Consistent Field (MCSCF) methods, including second quantization notation where relevant. We emphasize that while the best SCF procedures have third-order polynomial cost, and the best CI procedures have exponential cost, CI is often desired due to better convergence and accuracy. This explains why recent attempts to simulate electronic structure Hamiltonians using quantum computers focus on CI and not SCF. In the second part of this paper, we explain how polynomial-cost CI is cleverly achieved by using quantum phase estimation, in recent efforts by Lloyd, Aspuru-Guzik, Whitfield, and others. We tentatively conclude, however, that the various (seemingly) necessary physical assumptions at the foundations of the above-mentioned methods still make third- and fourth-order polynomial cost prohibitively inefficient for all but the smallest multi-atomic and multi-molecular systems, and that convergence issues stemming from these assumptions only add insult to injury.
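
As a brief sketch of the quantum phase estimation step mentioned above (the standard formulation, not the specific constructions of Lloyd, Aspuru-Guzik, Whitfield, and collaborators): if |\psi\rangle is an eigenstate of the electronic Hamiltonian H with energy E, then time evolution only imprints a phase,

e^{-iHt}\,|\psi\rangle = e^{-iEt}\,|\psi\rangle ,

and phase estimation reads this phase, and hence E, out to a desired precision using controlled applications of e^{-iHt} followed by an inverse quantum Fourier transform. Because e^{-iHt} for a second-quantized electronic Hamiltonian can be approximated with a number of gates polynomial in the number of spin-orbitals, this is how the polynomial-cost CI-type calculation discussed above is obtained.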

Philosophy of Physics

Over the years I have written papers on various philosophical issues in physics. Here is a sample of some of these musings:

{Space and Time}

This first paper is on the early debate between Ernst Cassirer and Moritz Schlick on the interpretation of general relativity: Logical Idealism & Einstein’s Theory of Relativity: Schlick’s Critique of Cassirer’s Monograph Zur Einsteinschen Relativitätstheorie (1921).

When Einstein’s general theory of relativity was formulated during the second decade of the 20th century, philosophers of both the neo-Kantian and logical empiricist variety scrambled to fit the revolutionary theory into their respective philosophical frameworks. A few even went so far as to claim that the new theory was an unambiguous confirmation of their particular philosophy of science. One of the first serious arguments of this kind was published in a 1921 monograph by the neo-Kantian Ernst Cassirer entitled Zur Einsteinschen Relativitätstheorie (“Einstein’s Theory of Relativity”). This book was also the subject of a decisive critique, published soon afterwards by the champion of logical empiricism Moritz Schlick, which dismissed it as a worthy but ultimately unsuccessful attempt to incorporate the new relativity into Kantian epistemology. Interestingly, some write that this article was in many ways responsible for the rise of logical empiricism and the precipitous decline of neo-Kantian thought among philosophers of science (logical idealism); that the debate between neo-Kantian and empiricist philosophy over relativity theory “effectively ended with Schlick’s essay” and that the article “may well be regarded as the point of departure of a new direction for scientific philosophy [i.e. logical empiricism].”
A few scholars have noted, however, that Schlick’s argument does not address Cassirer’s neo-Kantian epistemology and instead attacks the straw man of traditional Kantian epistemology – an easy target indeed, considering that Kant relied heavily on “outdated” ideas from Euclidean geometry and Newtonian mechanics. More specifically, Schlick argues that the new theory of relativity abolishes all old notions of the synthetic a priori and that Cassirer fails to specify new notions of the synthetic a priori in the new theory. However, he defines synthetic a priori in a traditional Kantian manner, and essentially dismisses Cassirer’s widely accepted neo-Kantian (Marburg school) understanding of the synthetic a priori outlined in his book and in previous publications as “transcending the region of critical [i.e. Kantian] philosophy proper.” This paper attempts to systematically argue that (1) Cassirer does indeed fail to incorporate Einstein’s theory of relativity into the traditional Kantian notion of constitutive synthetic a priori (as argued by Schlick), but that (2) Cassirer successfully incorporates Einstein’s theory of relativity into his neo-Kantian notion of regulative synthetic a priori.
This paper tentatively concludes that the oft-stated claim that Einstein’s theory of general relativity decisively refutes neo-Kantian philosophy of science in favor of logical empiricism is, to a large extent, unwarranted.

{Quantum Computation and Many Worlds}

Here is a link to an informal paper I wrote criticizing the supposed relationship between quantum computing and the Many Worlds Interpretation (MWI) of quantum mechanics: Will Advances in Quantum Computing Shed Light on Foundational Issues in Quantum Mechanics?

While taking a course on philosophical issues in quantum mechanics, I came across a 1986 article by David Deutsch called Three experimental implications of the Everett interpretation. On the first page, Deutsch writes:

“It has come to my attention that there are still some conference participants who harbor residual doubts about the Everett interpretation of quantum theory. I thought it might be helpful to leave aside for a moment the theoretical arguments pro and contra (such as they are) and look at a seldom mentioned but important aspect of Everett’s interpretation, namely its connection with experiment.” (p.215)

I was intrigued by this claim, especially after having been taught that Everett’s interpretation wasn’t experimentally verifiable. In slight disbelief, I jumped to the end of the paper and came across this passage:

“Experiment 3 is nothing less than a direct, unambiguous experimental test of the Everett quantum theory against any theory that has the wave function collapsing.” (p. 216)

There was no doubt that Deutsch was promising experiments. Could quantum computers really hold the key to the hidden mysteries of quantum mechanics? I resolved to understand what he had to say, and evaluate his claims myself.

{Poincaré and Einstein on Time}

I wrote a short essay on a debate between Poincaré and Einstein on the nature of time here.

The physicist-cum-philosopher Henri Poincaré and the all-too-well-known physicist Albert Einstein both dealt with the scientific characterization of time at a moment in history when new results were undermining well-established scientific theories. While Poincaré introduced some interesting concepts to the debate with his paper The Measure of Time (1898), it was really Einstein who shook the scientific establishment in his paper The Theory of Relativity (1911). This essay compares these two important papers and argues that the real difference between Poincaré and Einstein lies in their fundamentally different approaches.

{Metric Field Substantivalism}

Here is a short paper on substantivalism (the framework/belief that spacetime and its parts are fundamental constituents of reality):

Ever since its formulation in 1973, the “metric field substantivalist” characterization of space-time has been held in high regard by physicists and philosophers alike. One philosophically significant aspect of this characterization is that while it taxonomically belongs to the substantivalist camp, there are strong reasons to relegate this contemporary view of space-time to the relationist camp, or even to a third category of its own. This paper introduces metric field substantivalism within the context of traditional substantivalism and relationism, weighs various arguments for and against various modes of categorization, and ultimately concludes that the traditional dichotomous categories are not actually meaningful.

{Duhem-Quine Thesis}

This is a short paper on a connection between Willard van Orman Quine and Pierre Duhem.

In his seminal 1951 paper “Two Dogmas of Empiricism,” the celebrated American logician-cum-philosopher W. V. Quine argues that “our statements about the external world face the tribunal of sense experience not individually, but only as a corporate body” (Quine 355). As support, he cites the influential argument titled “An Experiment in Physics Can Never Condemn an Isolated Hypothesis but Only a Whole Theoretical Group” from P. Duhem’s classic 1906 book “The Aim and Structure of Physical Theory” (Duhem 183). At first glance, the two arguments don’t seem to be related, although upon closer inspection it is clear that Quine’s claim is closely linked to Duhem’s thesis. This of course raises the following question: how can Quine use Duhem’s philosophy - which explicitly assumes empirical (synthetic) truths - in a paper that viciously attacks the synthetic/analytic and reductionist dogmas of modern empiricism? And if it is then the case that Quine’s understanding of Duhem’s argument is justified, how can Duhem himself simultaneously believe in empirical truth and the underdetermination of physical theory by scientific experiment?

{Reductio ad Sensibilem}

Here is a fun little paper on the laws of nature =)

The idea that there exist laws of nature is hardly revolutionary. Most physicists would agree that the conservation of energy, the speed of light in a vacuum, and the charges of fundamental particles are all laws of nature. But the belief that the laws of nature can be explained – for example, that there is a reason electrons have a charge of -1e – is much more contentious. Can laws of nature really have reasons? And while a whole slew of natural laws might be “explained” by more fundamental natural laws, can those fundamental natural laws be explained? The purpose of this paper is to argue that the laws of nature as defined by popular models of lawhood are indeed explainable according to the process of reductio ad sensibilem.