Sunday, April 27, 2014

Evan Archer: April 29th

Bayesian nonparametric methods for entropy estimation in spike data

Shannon’s entropy is a basic quantity in information theory, and a useful tool for the analysis of neural codes. However, estimating entropy from data is a difficult statistical problem. In this talk, I will discuss the problem of estimating entropy in the “under-sampled regime”, where the number of samples is small relative to the number of symbols. Dirichlet and Pitman-Yor processes provide tractable priors over countably-infinite discrete distributions, and have found applications in Bayesian nonparametric statistics and machine learning. In this talk, I will show that they also provide natural priors for Bayesian entropy estimation. These nonparametric priors permit us to address two major issues with previously-proposed Bayesian entropy estimators: their dependence on knowledge of the total number of symbols, and their inability to account for the heavy-tailed distributions which abound in biological and other natural data. What’s more, by “centering” a Dirichlet Process over a flexible parametric model, we are able to develop Bayesian estimators for the entropy of binary spike trains using priors designed to flexibly exploit the statistical structure of simultaneously-recorded spike responses. Finally, in applications to simulated and real neural data, I will show that these estimators perform well in comparison to traditional methods.
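To make the undersampling problem concrete, here is a minimal sketch (not the estimators from the talk, which replace the fixed-alphabet Dirichlet prior with Dirichlet/Pitman-Yor process priors): the plug-in estimator badly underestimates entropy when samples are scarce, while the posterior-mean estimate under a symmetric Dirichlet prior illustrates the Bayesian approach the talk generalizes, at the cost of assuming the alphabet size K is known. All parameter choices below are illustrative.

```python
import numpy as np
from scipy.special import digamma

def plugin_entropy(counts):
    """Maximum-likelihood ("plug-in") entropy estimate, in nats."""
    p = counts[counts > 0] / counts.sum()
    return float(-np.sum(p * np.log(p)))

def dirichlet_entropy(counts, a=1.0):
    """Posterior-mean entropy (nats) under a symmetric Dirichlet(a) prior,
    assuming the alphabet size K = len(counts) is known -- exactly the
    assumption the Dirichlet/Pitman-Yor process priors in the talk remove."""
    counts = np.asarray(counts, dtype=float)
    N, K = counts.sum(), counts.size
    A = K * a
    w = (counts + a) / (N + A)
    return float(digamma(N + A + 1) - np.sum(w * digamma(counts + a + 1)))

rng = np.random.default_rng(0)
K = 1000
p = rng.dirichlet(np.full(K, 0.5))            # a "true" distribution with many rare symbols
true_H = -np.sum(p[p > 0] * np.log(p[p > 0]))
samples = rng.choice(K, size=100, p=p)        # undersampled: 100 samples, 1000 symbols
counts = np.bincount(samples, minlength=K)
print(f"true {true_H:.2f}  plug-in {plugin_entropy(counts):.2f}  "
      f"Dirichlet {dirichlet_entropy(counts):.2f}")
```

With only 100 samples over 1000 symbols, the plug-in estimate cannot exceed log(100) nats and falls well short of the true entropy.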

Thursday, April 24, 2014

Maurizio Filippone: April 30th

Pseudo-Marginal Bayesian Inference for Gaussian Processes

Statistical models where parameters have a hierarchical structure are commonly employed to flexibly model complex phenomena and to gain some insight into the functioning of the system under study.
Carrying out exact parameter inference for such models, which is key to achieving a sound quantification of uncertainty in parameter estimates and predictions, usually poses a number of computational challenges. In this talk, I will focus on Markov chain Monte Carlo (MCMC)-based inference for hierarchical models involving Gaussian Process (GP) priors and non-Gaussian likelihood functions.
After discussing why MCMC is the only way to infer parameters "exactly" in general GP models and pointing out the challenges in doing so, I will present a practical and efficient alternative to popular MCMC reparameterization techniques based on the so-called Pseudo-Marginal MCMC approach.
In particular, the Pseudo-Marginal MCMC approach yields samples from the exact posterior distribution over GP covariance parameters, but only requires an unbiased estimate of the analytically intractable marginal likelihood. Finally, I will present ways to construct unbiased estimates of the marginal likelihood in GP models, and conclude the talk by presenting results on several benchmark data and on a multi-class multiple-kernel classification problem with neuroimaging data.
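The pseudo-marginal idea can be sketched in a few lines: run Metropolis-Hastings, but plug an unbiased Monte Carlo estimate of the intractable marginal likelihood into the acceptance ratio, reusing the stored estimate for the current state. The toy model and all settings below are illustrative, not the GP models from the talk.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy model: y | f ~ N(f, 1) with latent f ~ N(0, exp(theta)).  We pretend the
# marginal likelihood p(y | theta) is intractable and replace it with an
# unbiased importance-sampling estimate (all settings are illustrative).
y = 2.0

def loglik_estimate(theta, n_samples=50):
    """Log of an unbiased Monte Carlo estimate of p(y | theta)."""
    f = rng.normal(0.0, np.exp(0.5 * theta), size=n_samples)  # draws of the latent
    like = np.exp(-0.5 * (y - f) ** 2) / np.sqrt(2.0 * np.pi)
    return np.log(np.mean(like))

def log_prior(theta):
    return -0.5 * theta ** 2            # theta ~ N(0, 1), up to a constant

n_iter, step = 2000, 0.5
theta = 0.0
log_post_hat = loglik_estimate(theta) + log_prior(theta)  # noisy, but held fixed
chain = np.empty(n_iter)
accepts = 0
for t in range(n_iter):
    prop = theta + step * rng.normal()
    log_post_prop = loglik_estimate(prop) + log_prior(prop)
    # Key point: reuse the stored noisy estimate for the current state; this
    # is what makes the chain target the exact posterior over theta.
    if np.log(rng.uniform()) < log_post_prop - log_post_hat:
        theta, log_post_hat = prop, log_post_prop
        accepts += 1
    chain[t] = theta
print(f"acceptance rate {accepts / n_iter:.2f}, posterior mean {chain.mean():.2f}")
```

The one subtlety is that the current state's likelihood estimate is never refreshed; redrawing it at every step would bias the chain, whereas holding it fixed yields samples from the exact posterior despite the noise in the estimates.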


Useful links

http://www.dcs.gla.ac.uk/~maurizio/index.html
http://arxiv.org/abs/1310.0740
http://arxiv.org/abs/1311.7320
http://www.dcs.gla.ac.uk/~maurizio/Publications/aoas12.pdf
http://www.dcs.gla.ac.uk/~maurizio/Publications/ml13.pdf

Saturday, April 19, 2014

Patrick J. Wolfe: April 23rd

Nonparametric estimation of network structure

Networks are a key conceptual tool for analysis of rich data structures, yielding meaningful summaries in the biological as well as other sciences.  As datasets become larger, however, the interpretation of network-based summaries becomes more challenging.  A natural next step in this context is to think of modeling a network nonparametrically -- and here we will show how such an approach is possible, both in theory and in practice.  As with a histogram, nonparametric models can fully represent variation in a network, without presupposing a particular set of motifs or other distributional forms.  Advantages and limitations of the approach will be discussed, along with open problems at the methodological frontier of statistical network analysis.  Joint work with David Choi (http://arxiv.org/abs/1212.4093) and Sofia Olhede (http://arxiv.org/abs/1309.5936/, http://arxiv.org/abs/1312.5306/).
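The histogram analogy can be made concrete with a crude sketch (a simplification, not the estimator from the linked papers): order the nodes of an observed network by degree, cut them into equal-sized bins, and estimate one edge density per pair of bins, just as a histogram estimates one height per interval. The synthetic two-group network below is illustrative.

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic network: a stochastic blockmodel with two planted groups
# (sizes and edge probabilities are illustrative).
sizes = [100, 100]
P = np.array([[0.8, 0.1],
              [0.1, 0.3]])
labels = np.repeat([0, 1], sizes)
n = len(labels)
A = (rng.uniform(size=(n, n)) < P[np.ix_(labels, labels)]).astype(int)
A = np.triu(A, 1)
A = A + A.T                                        # symmetric, no self-loops

def network_histogram(A, n_bins):
    """Crude 'network histogram': order nodes by degree, cut into equal bins,
    and estimate one edge density per pair of bins."""
    order = np.argsort(A.sum(axis=1))
    bins = np.array_split(order, n_bins)
    D = np.empty((n_bins, n_bins))
    for i, bi in enumerate(bins):
        for j, bj in enumerate(bins):
            block = A[np.ix_(bi, bj)]
            if i == j:
                m = len(bi)
                D[i, j] = block.sum() / (m * (m - 1))  # exclude the diagonal
            else:
                D[i, j] = block.mean()
    return D

D = network_histogram(A, n_bins=2)
print(np.round(D, 2))
```

As with an ordinary histogram, no particular motif set or distributional form is presupposed; the bin count trades off resolution against variance.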

Friday, April 11, 2014

Uygar Sümbül: April 16th

Submicron precision in the retina: classifying the cell types of the brain

The importance of cell types in understanding brain function is widely appreciated, but only a tiny fraction of neuronal diversity has been catalogued. Here, we exploit recent progress in the genetic definition of cell types to develop an objective structural approach to neuronal classification. The approach is based on highly accurate quantification of dendritic arbor position relative to neurites of other cells. We test the method on a population of 363 mouse retinal ganglion cells. For each cell, we determine the spatial distribution of its dendritic arbors, or "arbor density," with reference to the arbors of an abundant, well-defined interneuronal type. The arbor densities are sorted into a number of clusters that is set by comparison with several molecularly defined cell types. The algorithm reproduces the genetic classes that are pure types, and detects six newly clustered cell types that await genetic definition.
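The clustering step can be illustrated with a toy sketch (synthetic data and a generic k-means, not the registration-based pipeline from the talk): each cell is represented by a discretized arbor-density profile across depth, and profiles are grouped into a preset number of clusters. All names and numbers below are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic "arbor density" profiles: each cell is a dendritic-density profile
# across depth, discretized into 40 bins; three hypothetical types stratify
# at different depths (0.25, 0.5, 0.75).
depth = np.linspace(0, 1, 40)
profiles = []
for center in [0.25, 0.5, 0.75]:
    for _ in range(50):
        d = np.exp(-0.5 * ((depth - center - 0.02 * rng.normal()) / 0.08) ** 2)
        profiles.append(d / d.sum())
X = np.array(profiles)

def kmeans(X, k, n_iter=50, n_restarts=10):
    """Minimal k-means with restarts; in the talk, the number of clusters k
    is set by comparison with molecularly defined cell types."""
    best_assign, best_inertia = None, np.inf
    for _ in range(n_restarts):
        centroids = X[rng.choice(len(X), k, replace=False)]
        for _ in range(n_iter):
            dist = ((X[:, None, :] - centroids[None, :, :]) ** 2).sum(-1)
            assign = dist.argmin(axis=1)
            centroids = np.array([X[assign == j].mean(axis=0) if np.any(assign == j)
                                  else centroids[j] for j in range(k)])
        inertia = dist.min(axis=1).sum()
        if inertia < best_inertia:
            best_assign, best_inertia = assign, inertia
    return best_assign

assign = kmeans(X, k=3)
print(np.bincount(assign, minlength=3))   # cluster sizes
```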

Thursday, April 3, 2014

Rishidev Chaudhuri: April 9th

Timescales and the large-scale organization of cortical dynamics

In the first part of this talk I will present results from a model of 29 interacting areas in the macaque cortex. We built this model by combining quantitative data on long-range projections between cortical areas with an estimate of the strength of excitatory connections within an area. These anatomical constraints naturally give rise to a hierarchy of timescales in network activity: early sensory areas respond in a moment-to-moment fashion, allowing them to track a changing environment, while cognitive areas show long timescales, potentially providing the substrate for information integration and flexible decision-making. We characterize the dependence of this hierarchy on local and long-range anatomical properties and show the existence of multiple dynamical hierarchies subserved by the same anatomical structure. The model thus demonstrates how variations in anatomical properties across the cortex can produce dynamical and functional specialization in timescales of response.
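The core mechanism, graded local recurrent strength producing a hierarchy of timescales, can be sketched with a small linear rate model (sizes and weights below are illustrative, not the fitted 29-area model): stronger local excitation slows an area's effective decay, stretching its intrinsic time constant from tau to roughly tau / (1 - w).

```python
import numpy as np

tau = 0.02                              # 20 ms intrinsic time constant (assumed)
n = 8                                   # small stand-in for the 29-area model
w_local = np.linspace(0.2, 0.9, n)      # graded local excitatory strength along the hierarchy
W = np.diag(w_local)
for i in range(n - 1):
    W[i + 1, i] = 0.05                  # weak feedforward long-range projections (illustrative)

# Linear rate dynamics: tau * dr/dt = -r + W @ r, i.e. dr/dt = J @ r
J = (W - np.eye(n)) / tau
timescales = np.sort(-1.0 / np.real(np.linalg.eigvals(J)))
print(np.round(1e3 * timescales, 1))    # in ms, increasing along the hierarchy
```

With purely feedforward coupling the mode timescales are exactly tau / (1 - w_i), so areas at the top of the hierarchy integrate over far longer windows than early sensory areas.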

I will then describe a network model for the temporal structure of human ECoG dynamics. We find that the power spectra of ECoG recordings are well described by the output of a randomly connected linear dynamical network with net excitatory interactions between nodes. The architecture predicts that slow fluctuations show long-range spatial correlations, and that decorrelation of inputs to the network could account for observed changes in ECoG power spectra upon task initiation. It also predicts that networks with strongly local connectivity should produce power spectra that show "1/f" behavior at low frequencies. This analysis provides mechanistic insight into emergent network dynamics, links observed changes in power spectra to particular reconfigurations of the network, and could help characterize differences between cortical regions, states and subjects.
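A minimal sketch of the modeling idea, assuming a generic randomly connected linear network driven by white noise (parameters illustrative, not fitted to ECoG): the power spectrum follows directly from the network's transfer function and falls off with frequency.

```python
import numpy as np

rng = np.random.default_rng(4)

# Randomly connected linear network driven by white noise:
#   tau * dx/dt = -x + W @ x + noise,  i.e.  dx/dt = J @ x + noise
n, tau, g = 50, 0.02, 0.8               # size, time constant, coupling gain (illustrative)
W = g * rng.normal(0.0, 1.0 / np.sqrt(n), size=(n, n))
J = (W - np.eye(n)) / tau               # stable for g < 1

def total_power(freq_hz):
    """Summed power spectral density across nodes, for unit white-noise input."""
    w = 2.0 * np.pi * freq_hz
    H = np.linalg.inv(1j * w * np.eye(n) - J)   # network transfer function
    return float(np.sum(np.abs(H) ** 2))

freqs = np.array([1.0, 4.0, 16.0, 64.0])
S = np.array([total_power(f) for f in freqs])
slopes = np.diff(np.log(S)) / np.diff(np.log(freqs))
print(np.round(slopes, 2))              # log-log slopes between successive frequencies
```

In this framing, changes in the measured spectrum map onto changes in the coupling matrix or the input correlations, which is the sense in which the model links spectral observations to network reconfiguration.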