Monday, May 30, 2011

Eric Shea-Brown : May 31st

Eric Shea-Brown, who has come all the way from the University of Washington, will be speaking about:

A mechanistic approach to multi-spike patterns in neural circuits:
There is a combinatorial explosion in the number of possible activity patterns in neural circuits of increasing size, enabling enormous complexity in which patterns occur and in how this depends on incoming stimuli.  However, recent experiments show that this complexity is not always accessed -- the activity of many neural populations is remarkably well captured by simpler descriptions that rely only on the activity of single neurons and neuron pairs.

What is especially intriguing is that these pairwise descriptions succeed even in cases where circuit architecture seems likely to create a far more complex set of outputs.  We seek a mechanistic understanding of this phenomenon -- and predictions for when it will break down -- based on simple models of spike generation, circuit connectivity, and stimuli.  This also offers a chance to explore how much (and how little) beyond-pairwise spike patterns can matter to coding in different circuits.

As a specific application, we consider the empirical success of pairwise models in capturing the activity of ON-parasol retinal ganglion cells. We first use intracellular recordings to fully constrain a model of the underlying circuit dynamics.  Our theory then provides an explanation for experimental findings based on ON-parasol stimulus filtering and spike generation properties.  

This is joint work with Andrea Barreiro, Julijana Gjorgjieva, and Fred Rieke.
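
As a warm-up for the talk, here is one way to make the abstract's "pairwise descriptions" concrete: a toy sketch (mine, not the speaker's) that fits a pairwise maximum-entropy (Ising) model to synthetic binary spike words by brute-force moment matching, then checks how well the fitted word probabilities track the empirical ones. All parameter values and the thresholded-Gaussian data generator are invented for the illustration, and the brute-force enumeration only works for small populations:

    import numpy as np
    from itertools import product

    rng = np.random.default_rng(0)
    N = 5                                   # toy population; 2**N = 32 possible spike words

    # Synthetic correlated spike words from a thresholded Gaussian
    # (a common toy stand-in for real multi-neuron recordings).
    cov = 0.7 * np.eye(N) + 0.3 * np.ones((N, N))
    words = (rng.multivariate_normal(np.zeros(N), cov, size=20000) > 0.5).astype(float)

    mu_data = words.mean(axis=0)                                  # target <x_i>
    iu = np.triu_indices(N, k=1)
    pair_data = (words[:, iu[0]] * words[:, iu[1]]).mean(axis=0)  # target <x_i x_j>, i < j

    patterns = np.array(list(product([0, 1], repeat=N)), dtype=float)  # all 2**N words
    pair_pat = patterns[:, iu[0]] * patterns[:, iu[1]]

    # Fit P(x) proportional to exp(h.x + sum_{i<j} J_ij x_i x_j) by plain
    # gradient ascent on the maximum-entropy log-likelihood (moment matching).
    h, J = np.zeros(N), np.zeros(len(iu[0]))
    for _ in range(5000):
        logw = patterns @ h + pair_pat @ J
        p = np.exp(logw - logw.max())
        p /= p.sum()
        h += 0.5 * (mu_data - patterns.T @ p)
        J += 0.5 * (pair_data - pair_pat.T @ p)

    # Compare fitted word probabilities with the empirical word distribution;
    # the weights match the ordering produced by product() above (first bit most significant).
    codes = (words @ (2.0 ** np.arange(N - 1, -1, -1))).astype(int)
    p_emp = np.bincount(codes, minlength=2 ** N) / len(words)
    print("L1 mismatch, pairwise model vs data:", np.abs(p - p_emp).sum())

With real retinal data the interesting question is exactly the one in the abstract: when does this mismatch stay small even though the circuit could in principle produce far richer statistics?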

Monday, May 9, 2011

Max Nikitchenko: May 10

This Tuesday, on 2011/05/10, I will discuss methods for accelerating the convergence of algorithms that converge only linearly near the fixed point, such as EM, which are notoriously slow in that regime. Two approaches are possible: modify the iterative algorithm itself (PX-EM, which expands the parameter space; ECM, which maximizes over each parameter individually while keeping the others fixed; etc.), or use the recent history of the iterations to extrapolate closer to the fixed point (in which case you keep all your machinery intact and only plug in an auxiliary function that extrapolates the already-computed iteration steps). I will talk about the second class of accelerators.
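
To fix ideas, the simplest member of that second class is Aitken's delta-squared extrapolation, which uses three successive iterates to jump toward the limit of a linearly convergent sequence. The sketch below is a generic textbook version (not the method I will present on Tuesday), applied componentwise to a black-box fixed-point map:

    import numpy as np

    def aitken(F, x, n_rounds=5):
        """Accelerate the fixed-point iteration x <- F(x) using only the
        iterates themselves: from three successive iterates, Aitken's
        delta-squared formula extrapolates toward the limit. The base
        algorithm F is treated as a black box and left untouched."""
        for _ in range(n_rounds):
            x1 = F(x)
            x2 = F(x1)
            d1 = x1 - x                        # first difference
            d2 = x2 - 2.0 * x1 + x             # second difference
            safe = np.abs(d2) > 1e-12          # avoid dividing by ~0 componentwise
            x = np.where(safe, x - d1 ** 2 / np.where(safe, d2, 1.0), x2)
        return x

    # Toy linearly convergent map with fixed point x* = 10; plain iteration
    # needs dozens of steps, while the extrapolation lands on x* immediately.
    F = lambda x: 0.9 * x + 1.0
    print(aitken(F, np.zeros(1)))

For an exactly linear map the extrapolation is exact after a single round; that is the idealized version of what the fancier schemes achieve near a fixed point, where EM behaves almost linearly.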

I will start with a method I derived myself, which is visual but powerful at the same time. I will then focus on two papers that seem to be becoming the gold standard among acceleration techniques: Varadhan, R. & Roland, C., "Simple and Globally Convergent Methods for Accelerating the Convergence of Any EM Algorithm" (dx.doi.org/10.1111/j.1467-9469.2007.00585.x), 2008, and Zhou, H., Alexander, D. & Lange, K., "A quasi-Newton acceleration for high-dimensional optimization algorithms" (dx.doi.org/10.1007/s11222-009-9166-3), 2011. I have only just found out about the second paper, and it seems to overlap heavily with the method I derived. I hope we will clear this question up!
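
For a preview of the first paper, its core update can be sketched in a few lines (my paraphrase of the "squared" extrapolation step; F stands for one sweep of whatever base EM map you already have, and the paper's globalization safeguards are omitted):

    import numpy as np

    def squarem_step(F, theta):
        """One 'squared' extrapolation step in the spirit of Varadhan &
        Roland's SQUAREM (steplength alpha = -|r|/|v|; the paper adds
        safeguards, e.g. monotonicity checks, that are omitted here)."""
        theta1 = F(theta)                  # one sweep of the base EM map
        theta2 = F(theta1)                 # a second sweep
        r = theta1 - theta                 # first difference
        v = (theta2 - theta1) - r          # second difference
        alpha = -np.linalg.norm(r) / max(np.linalg.norm(v), 1e-12)
        theta_prime = theta - 2.0 * alpha * r + alpha ** 2 * v
        return F(theta_prime)              # stabilizing base-map sweep

Note that alpha = -1 recovers theta2, i.e. two plain EM sweeps, which is why the safeguarded scheme can always fall back on ordinary EM when the extrapolation misbehaves.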

Sunday, May 1, 2011

class presentations may 3 at 3:00

hi all - this tuesday we won't have normal group meeting. instead, the
students in my class will be giving presentations about the projects they
have been working on this semester. talk titles are here:
http://www.stat.columbia.edu/~liam/teaching/neurostat-spr11/talks.txt

presentations will begin at 3, and each one should last 15 min or so.
everyone's welcome to attend - hope to see you there.