David will be giving a fly-by view of a number of cool papers from NIPS.

First is Empirical Models of Spiking in Neural Populations by Macke, Büsing, Cunningham, Yu, Shenoy and Sahani, where they evaluate the relative merits of GLMs with pairwise coupling and state-space models on multielectrode recordings from motor cortex.
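To make the first model class concrete, here is a minimal sketch of a pairwise-coupled spiking GLM: each neuron's firing rate in a time bin depends on its own baseline plus the previous bin's population activity. All sizes, weights and names below are illustrative assumptions, not the paper's fitted model.

```python
import numpy as np

rng = np.random.default_rng(0)

n_neurons, n_bins = 3, 1000
baseline = np.array([-2.0, -2.5, -1.5])   # log base firing rates
# weak inhibitory pairwise coupling (illustrative; keeps dynamics stable)
coupling = -0.2 * (np.ones((n_neurons, n_neurons)) - np.eye(n_neurons))

spikes = np.zeros((n_neurons, n_bins), dtype=int)
for t in range(1, n_bins):
    # conditional intensity given the previous bin's population activity
    log_rate = baseline + coupling @ spikes[:, t - 1]
    spikes[:, t] = rng.poisson(np.exp(log_rate))

print(spikes.sum(axis=1))  # total spike count per neuron
```

A state-space model would instead route the shared variability through a low-dimensional latent trajectory rather than direct pairwise terms; comparing those two accounts is the point of the paper.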

Next, Quasi-Newton Methods for Markov Chain Monte Carlo by Zhang and Sutton looks at how to use approximate second-order methods like L-BFGS for MCMC while still preserving detailed balance.
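The core idea can be sketched very simply: use (approximate) second-order curvature to precondition a Metropolis proposal. In this toy version the inverse-Hessian estimate is fixed, so the Gaussian proposal is symmetric and the standard Metropolis acceptance rule preserves detailed balance; the paper's actual algorithm adapts an L-BFGS approximation inside the sampler, which requires more care. The target density and all constants here are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

# Target: anisotropic Gaussian, log p(x) = -0.5 x^T A x (up to a constant)
A = np.array([[4.0, 0.0], [0.0, 0.25]])
log_p = lambda x: -0.5 * x @ A @ x

H_inv = np.linalg.inv(A)        # approximate inverse Hessian of -log p
L = np.linalg.cholesky(H_inv)   # scale of the preconditioned proposal

x = np.zeros(2)
samples, accepted = [], 0
for _ in range(5000):
    prop = x + L @ rng.standard_normal(2)   # curvature-matched step
    # symmetric proposal => plain Metropolis ratio preserves detailed balance
    if np.log(rng.random()) < log_p(prop) - log_p(x):
        x, accepted = prop, accepted + 1
    samples.append(x)

samples = np.array(samples)
print(accepted / 5000)          # acceptance rate
```

The payoff is that the step size is automatically matched to each direction's curvature, so badly scaled targets mix far better than under an isotropic random walk.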

Then, Demixed Principal Component Analysis presents an extension of PCA that demixes the dependence of different latent dimensions on different observed parameters, and is applied to neural data from prefrontal cortex (PFC).
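A toy sketch of the demixing idea: split trial-averaged population data into marginalizations over task parameters (here, stimulus and time), then find principal axes of each marginalization separately, so that each recovered dimension depends on only one parameter. The array shapes and parameter names are illustrative assumptions, not the paper's exact algorithm.

```python
import numpy as np

rng = np.random.default_rng(2)

n_neurons, n_stimuli, n_times = 20, 4, 50
X = rng.standard_normal((n_neurons, n_stimuli, n_times))

Xc = X - X.mean(axis=(1, 2), keepdims=True)   # remove grand mean

# stimulus marginalization: average out time;
# time marginalization: average out stimulus
X_stim = Xc.mean(axis=2)   # (n_neurons, n_stimuli)
X_time = Xc.mean(axis=1)   # (n_neurons, n_times)

# leading axis of each marginalization via SVD
u_stim = np.linalg.svd(X_stim, full_matrices=False)[0][:, 0]
u_time = np.linalg.svd(X_time, full_matrices=False)[0][:, 0]

print(u_stim.shape, u_time.shape)   # each a direction in neuron space
```

Plain PCA on `Xc` would instead return axes that mix stimulus and time variance, which is exactly what the demixed variant is designed to avoid.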

Finally, Learning to Learn with Compound Hierarchical-Deep Models combines a deep neural network for learning visual features with a hierarchical nonparametric Bayesian model for learning object categories, making for one cool-looking demo.
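The "learning to learn" part of the hierarchy can be illustrated with a toy shrinkage calculation: categories share structure through a super-category prototype, so an estimate for a brand-new category seen only once gets pulled toward that shared prototype. Here random vectors stand in for deep-network features, and the weights are illustrative assumptions rather than the paper's inference procedure.

```python
import numpy as np

rng = np.random.default_rng(3)

d = 8                                  # feature dimension
super_mean = rng.standard_normal(d)    # super-category prototype

# known categories: noisy copies of the super-category prototype
category_means = super_mean + 0.3 * rng.standard_normal((5, d))

# a single example from a brand-new category in the same super-category
new_example = super_mean + 0.4 * rng.standard_normal(d)

# hierarchical shrinkage: blend the lone example with the shared prior
w = 0.5
new_category_mean = w * new_example + (1 - w) * super_mean

err_hier = np.linalg.norm(new_category_mean - super_mean)
err_flat = np.linalg.norm(new_example - super_mean)
print(err_hier < err_flat)   # → True: the estimate moves toward the prior
```

This borrowing of statistical strength across categories is what lets the full model recognize a new object class from one or two examples.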