Sunday, August 22, 2010

Daniel Soudry : Sept. 1

The neuron as a population of ion channels - 
the emergence of stochastic and history dependent behavior.


The classic view of a neuron as a point element that sums a large number of small synaptic currents and compares the sum to a fixed threshold is becoming difficult to sustain, given the plethora of non-linear regenerative processes known to take place in the soma, the axon, and even the dendritic tree. Since a common source of this complexity in the input, the soma, and the output is the behavior of ion channels, we propose viewing the neuron as a population of channels.

Analyzing the stochastic nature of ion channels using a recently developed mathematical model, we provide a rather general characterization of the neuron's input-output relation, which admits a surprising level of analytic tractability.

The resulting view provides a clear quantitative explanation of history-dependent effects in neurons and of the observed irregularity in firing. Interestingly, this explanation of firing irregularity does not require a globally balanced state; rather, it follows from the intrinsic properties of a single neuron.
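
To get a feel for the "population of channels" view, here is a toy simulation (not from the talk; all rates and counts are illustrative) of N independent two-state channels. Even with constant input, a finite population produces fluctuating open fractions - intrinsic stochasticity with no external noise:

```python
import numpy as np

# Toy "population of channels": N two-state (closed <-> open) channels flip
# independently, opening with probability alpha*dt and closing with
# probability beta*dt per time step. Parameters are illustrative.
rng = np.random.default_rng(0)
N = 1000                     # channels in the population
alpha, beta = 0.5, 1.0       # opening / closing rates (1/ms)
dt, T = 0.01, 2000           # step size (ms) and number of steps

open_state = np.zeros(N, dtype=bool)
frac_open = np.empty(T)
for t in range(T):
    u = rng.random(N)
    # open channels stay open unless they close; closed ones may open
    open_state = np.where(open_state, u >= beta * dt, u < alpha * dt)
    frac_open[t] = open_state.mean()

# The fraction open relaxes to alpha/(alpha+beta) = 1/3 but keeps
# fluctuating around it with std ~ 1/sqrt(N).
p_hat = frac_open[T // 2:].mean()
```

Because the fluctuations scale as 1/sqrt(N), the channel population size directly sets how noisy the single neuron looks.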

Saturday, August 21, 2010

Yashar Ahmadian : August 25th

Yashar will be continuing where he left off: Feynman diagrams and other goodies.

Thursday, August 19, 2010

Submit jobs to the HPC cluster from matlab

While we are talking about tools for using the HPC cluster, here's an ad for a tool of my own.

I have been using agricola to submit jobs to the HPC cluster from within matlab.  It is a very simple tool: instead of launching a calculation on your local machine by typing at the matlab prompt:

my_result = my_function( some_parameters ) ;

one types:

sow( 'my_result' , @()my_function( some_parameters ) ) ;

This will copy all the .m files in your current directory into a folder on the HPC submit machine, generate a submit file there, and launch the calculation on the cluster. Then some time later, when you suspect the job is done, you call reap, which makes the variable  my_result  appear in your matlab workspace.  reap itself returns all the .out, .log, and .err files for you to look at from within matlab.

Unlike Max's code, agricola does not aim to parallelize your code; it just handles sending files back and forth with ssh and job submission.

Tuesday, August 17, 2010

Max Nikitchenko: Lab Meeting, Aug 18, 2010

I will try to cover two topics: multithreading with Matlab on HPC and new numeric methods for the density forward propagation.

In the first part (~15-20 min), I'll briefly present Matlab code that allows easy and flexible multithreading for loops whose iterations are independent blocks with different values of the loop variable. It should be useful in many computationally expensive optimization problems. The main problem was to devise a method for locking a JobSubmit file, which is used for communication between the main program and the threads. Unfortunately, I have just discovered that the method I implemented is not 100% reliable. Still, the code works in most cases, and in the rare situations when the file-locking method fails it simply leads to duplicate computations.
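
For flavor, here is one common way to implement such a lock (a Python sketch, not Max's actual code; the names JobSubmit and acquire_lock are made up): creating the lock file with O_CREAT | O_EXCL is atomic, so only one worker can win.

```python
import errno
import os
import tempfile

# Illustrative file-based lock: os.O_CREAT | os.O_EXCL makes creation of
# the lock file atomic, so exactly one worker succeeds.
def acquire_lock(lockpath):
    """Try to take the lock; return True on success, False if already held."""
    try:
        fd = os.open(lockpath, os.O_CREAT | os.O_EXCL | os.O_WRONLY)
        os.close(fd)
        return True
    except OSError as e:
        if e.errno == errno.EEXIST:
            return False                 # someone else holds the lock
        raise

def release_lock(lockpath):
    os.remove(lockpath)

# Guard a read-modify-write of the shared JobSubmit file.
workdir = tempfile.mkdtemp()
lock = os.path.join(workdir, "JobSubmit.lock")
if acquire_lock(lock):
    try:
        pass                             # update JobSubmit here
    finally:
        release_lock(lock)
```

One caveat: on NFS mounts, O_EXCL was historically not guaranteed atomic, so any file-based lock on a cluster filesystem can occasionally fail - possibly related to the rare failures mentioned above.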

The second part will be on numeric methods for forward propagation. In recent years a number of articles have been published on methods for solving the Fokker-Planck equation associated with the stochastic integrate-and-fire model. We develop a new method for numerically estimating the forward propagation density by computing it via direct quadratic convolution on a dynamic adaptive grid. This method allows us to significantly improve the accuracy of the computations, to avoid special treatment of extreme cases, and to improve (or at least preserve) speed in comparison with other methods. We also found that below some value of the time step the numeric propagation becomes unstable. By treating the density as distributed across the bins rather than concentrated at the bin centers, we derive a simple condition for the stability of the method. Interestingly, this condition links the temporal and spatial resolutions linearly - contrary to the well-known Courant stability condition for the Fokker-Planck equation. We further improve the speed of the method by combining it with the fast Gauss transform.
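
A stripped-down version of the basic step can be sketched in a few lines (a Python toy on a fixed grid; the talk's method uses a dynamic adaptive grid plus the fast Gauss transform, and all parameters here are illustrative). For pure drift-diffusion dX = mu*dt + sigma*dW, one time step maps the density through a Gaussian kernel with mean mu*dt and variance sigma^2*dt:

```python
import numpy as np

# Forward density propagation by direct convolution on a fixed grid.
mu, sigma = 1.0, 0.5
dx, dt = 0.01, 0.001
x = np.arange(-2.0, 2.0, dx)

p = np.exp(-x**2 / (2 * 0.1**2))
p /= p.sum() * dx                      # initial density ~ N(0, 0.1^2)

k = np.arange(-50, 51) * dx            # kernel support, +/- 0.5
kern = np.exp(-(k - mu * dt)**2 / (2 * sigma**2 * dt))
kern /= kern.sum()                     # one-step transition kernel

for _ in range(100):                   # propagate to t = 0.1
    p = np.convolve(p, kern, mode="same")

mass = p.sum() * dx                    # stays ~ 1 away from the grid edges
mean = (x * p).sum() * dx / mass       # ~ mu * t = 0.1
var = ((x - mean)**2 * p).sum() * dx / mass   # ~ 0.1^2 + sigma^2 * t
```

Note that here the kernel width sigma*sqrt(dt) is about 1.6 bins; shrinking dt at fixed dx collapses the sampled kernel toward a single bin, which at least gives a feel for why a condition linking dt and dx linearly (rather than a Courant-type condition) can appear - though the actual derivation in the talk uses the distributed-density argument.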

Tuesday, August 10, 2010

Yashar Ahmadian : August 11th

Yashar will be presenting preliminary work on applying random matrix theory to the study of transient dynamics in a non-normal linear neural network. 


The project is a collaboration with Ken Miller, and is motivated by his work on non-normal dynamics and transient amplification due to non-normality. I will give a brief background on this work
(see this paper: "Balanced amplification: a new mechanism of selective amplification of neural activity patterns" by B.K. Murphy and K.D. Miller), and then give an exposition of the diagrammatic method for calculating averages over a random (Hermitian N x N) matrix ensemble in the large N limit.

As an example, I will present how to derive the semi-circular law for Gaussian Hermitian matrices. 
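
The semicircle law derived diagrammatically in the talk is easy to check numerically (a quick sketch; the matrix size and seed are arbitrary): the eigenvalues of a large Gaussian Hermitian matrix, scaled by sqrt(N), fill the density rho(x) = sqrt(4 - x^2) / (2*pi) on [-2, 2].

```python
import numpy as np

# Numerical check of the semicircle law for a Gaussian Hermitian ensemble.
rng = np.random.default_rng(1)
N = 1000
A = rng.standard_normal((N, N)) + 1j * rng.standard_normal((N, N))
H = (A + A.conj().T) / 2               # Hermitian, E|H_ij|^2 = 1 off-diagonal
evals = np.linalg.eigvalsh(H) / np.sqrt(N)

# The spectrum should be (nearly) confined to [-2, 2], symmetric about 0,
# with second moment matching the semicircle's value of 1.
m1 = evals.mean()
m2 = (evals**2).mean()
```

Already at N = 1000 the empirical moments sit close to their large-N values, which makes this a handy sanity check for the diagrammatic calculation.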

Finally, I will discuss how one can extend the method to cover the non-normal case, and I will derive a formula for the spectral density in the large N limit.

Monday, August 9, 2010

Optimal experimental design for sampling voltage on dendritic trees

Here is a link to a draft of the paper that came out of my research with Liam this summer:

We are looking for feedback, so if the abstract below piques your interest please take a look at the paper and let us know what you think.

Due to the limitations of current voltage sensing techniques, optimal filtering of noisy, undersampled voltage signals on dendritic trees is a key problem in computational cellular neuroscience. These limitations lead to two sources of difficulty: 1) voltage data are incomplete (in the sense of only capturing a small portion of the full spatiotemporal signal) and 2) these data are available in only limited quantities for a single neuron. In this paper we use a Kalman filtering framework to develop optimal experimental design for voltage sampling. Our approach is to use a simple greedy algorithm with lazy evaluation to minimize the expected mean-square error of the estimated spatiotemporal voltage signal. We take advantage of some particular features of the dendritic filtering problem to efficiently calculate the estimator covariance by approximating it as a low-rank perturbation to the steady-state (zero-SNR) solution. We test our framework with simulations of real dendritic branching structures and compare the quality of both time-invariant and time-varying sampling schemes. The lazy evaluation proved critical to making the optimization tractable. In the time-invariant case improvements ranged from 30% to 100% over simpler methods, with larger gains for smaller numbers of observations. Allowing for time-dependent sampling produced up to an additional 30% improvement.
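
The lazy-evaluation trick the abstract credits with making the optimization tractable can be sketched generically (a Python toy; the real objective is the Kalman-filter MSE on a dendritic tree, while the "coverage" gain below is a made-up stand-in with the same diminishing-returns property that lazy evaluation relies on):

```python
import heapq
import numpy as np

def lazy_greedy(candidates, gain, k):
    """Select k items greedily; gain(c, selected) must have diminishing
    returns, so a stale gain is always an upper bound on the fresh one."""
    selected = []
    heap = [(-gain(c, selected), c) for c in candidates]
    heapq.heapify(heap)
    while len(selected) < k and heap:
        neg_g, c = heapq.heappop(heap)
        fresh = gain(c, selected)          # re-score only the current leader
        if heap and fresh < -heap[0][0]:   # stale bound beaten: re-queue
            heapq.heappush(heap, (-fresh, c))
        else:
            selected.append(c)             # still the best, take it
    return selected

# Toy problem: place 4 "electrodes" on [0, 1] to cover 101 target points.
targets = np.linspace(0.0, 1.0, 101)
sites = [round(s, 2) for s in np.linspace(0.0, 1.0, 21)]

def coverage_gain(c, selected):
    """Marginal gain of adding site c (facility-location style objective)."""
    best = np.zeros_like(targets)
    for s in selected:
        best = np.maximum(best, np.exp(-np.abs(targets - s) / 0.1))
    new = np.maximum(best, np.exp(-np.abs(targets - c) / 0.1))
    return float((new - best).sum())

picked = lazy_greedy(sites, coverage_gain, 4)
```

Because most candidates' stale gains never rise back to the top of the heap, only a handful of sites get re-scored per selection step, which is exactly why lazy evaluation pays off when each gain evaluation (here trivial, in the paper a covariance update) is expensive.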