Tuesday, February 26, 2013

David Blei: Feb 27th

Stochastic Variational Inference

Abstract: We develop stochastic variational inference, a scalable algorithm for approximating posterior distributions. We develop this technique for a large class of probabilistic models and demonstrate it with two probabilistic topic models, latent Dirichlet allocation and the hierarchical Dirichlet process topic model. Using stochastic variational inference, we analyze several large collections of documents: 300K articles from Nature, 1.8M articles from The New York Times, and 3.8M articles from Wikipedia. Stochastic inference can handle the full data and outperforms traditional variational inference, which can only handle a subset. (Further, we show that the Bayesian nonparametric topic model outperforms its parametric counterpart.) Stochastic variational inference lets us apply complex Bayesian models to very large data sets.

You can read the paper here.
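The core update in stochastic variational inference is a noisy natural-gradient step: sample a minibatch, compute the global variational parameter that would be optimal if that minibatch were replicated to full-data size, and blend it into the current estimate with a decaying step size. Here is a minimal sketch on a toy conjugate Gaussian model (our illustration only; the paper develops the method for topic models):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy conjugate model (an illustration, not the paper's topic models):
# x_i ~ N(mu, sigma2), prior mu ~ N(0, tau2). The exact posterior over mu
# is Gaussian, so we can check the stochastic updates against it.
N, sigma2, tau2 = 10_000, 1.0, 10.0
x = rng.normal(2.0, np.sqrt(sigma2), size=N)

# Natural parameters of the Gaussian prior: (m / s2, -1 / (2 s2))
prior = np.array([0.0, -1.0 / (2 * tau2)])

# Exact posterior natural parameters, for comparison
exact = prior + np.array([x.sum() / sigma2, -N / (2 * sigma2)])

# Stochastic variational inference: sample a minibatch, pretend it was
# replicated N/B times, compute the resulting "intermediate" natural
# parameter, and blend it in with a Robbins-Monro step size.
lam = prior.copy()
B, tau0, kappa = 100, 1.0, 0.7          # minibatch size, delay, forgetting rate
for t in range(1, 2001):
    batch = rng.choice(x, size=B, replace=False)
    scale = N / B                        # reweight minibatch to full-data size
    lam_hat = prior + scale * np.array([batch.sum() / sigma2,
                                        -B / (2 * sigma2)])
    rho = (t + tau0) ** (-kappa)         # decaying step size
    lam = (1 - rho) * lam + rho * lam_hat

# Recover mean/variance from natural parameters: s2 = -1/(2 eta2), m = eta1 s2
post_var = -1.0 / (2 * lam[1])
post_mean = lam[0] * post_var
print(post_mean, post_var)
```

The schedule rho_t = (t + tau0)^(-kappa) with kappa in (0.5, 1] satisfies the Robbins-Monro conditions, so in this conjugate toy case the iterates converge to the exact posterior.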

Wednesday, February 20, 2013

Carl Smith and Ari Pakman: Feb. 20

We review and present new results on spike-and-slab priors for imposing sparsity in regression problems.


- Why spike-and-slab?
- Variational Bayes approximation to the posterior
- Computing hyperparameters using empirical Bayes
- Singular and non-singular Markov chains for MCMC
- Gibbs sampling for the posterior sparsity variables
- Extension to regression with positive coefficients
- Example application: finding synaptic weights in a dendritic tree
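To make the posterior sparsity variables concrete, here is a minimal collapsed Gibbs sampler for a spike-and-slab linear model (a hypothetical toy setup, not the talk's dendritic-tree application): each coefficient is exactly zero ("spike") with probability 1 - pi or drawn from a Gaussian "slab" with probability pi; the slab can be integrated out analytically, leaving a sampler over the binary inclusion indicators alone.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical sparse regression instance: y = X w + noise
n, p = 60, 6
X = rng.normal(size=(n, p))
true_w = np.array([3.0, 0.0, -2.0, 0.0, 0.0, 0.0])   # sparse ground truth
sigma2, tau2, pi = 0.5, 4.0, 0.25                    # noise, slab, prior incl. prob.
y = X @ true_w + rng.normal(0.0, np.sqrt(sigma2), size=n)

def log_marginal(s):
    """log p(y | s) up to a constant: slab coefficients integrated out."""
    XS = X[:, s.astype(bool)]
    C = sigma2 * np.eye(n) + tau2 * XS @ XS.T
    _, logdet = np.linalg.slogdet(C)
    return -0.5 * (logdet + y @ np.linalg.solve(C, y))

# Collapsed Gibbs sampler over the binary inclusion indicators s_j
s = np.ones(p)                           # start with all features included
counts = np.zeros(p)
n_iter, burn = 600, 100
for it in range(n_iter):
    for j in range(p):
        s[j] = 1.0
        log_in = np.log(pi) + log_marginal(s)
        s[j] = 0.0
        log_out = np.log(1 - pi) + log_marginal(s)
        diff = np.clip(log_out - log_in, -500, 500)   # avoid overflow in exp
        prob_in = 1.0 / (1.0 + np.exp(diff))
        s[j] = float(rng.random() < prob_in)
    if it >= burn:
        counts += s

incl_prob = counts / (n_iter - burn)     # posterior inclusion probabilities
print(incl_prob)
```

The sampler should assign high inclusion probability to the two truly nonzero coefficients and low probability to the rest; sampling the slab coefficients given the selected support is then a standard Gaussian conditional.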

Friday, February 8, 2013

Garud Iyengar: Feb 13

Title: Fast first-order augmented Lagrangian algorithms for sparse optimization problems

In this talk we will survey recent work on fast first-order algorithms for solving optimization problems with non-trivial conic constraints. These algorithms are augmented Lagrangian algorithms; however, unlike traditional augmented Lagrangian algorithms, we update the penalty multiplier during the course of the algorithm. The algorithm iterates are epsilon-feasible and epsilon-optimal in O(log(1/epsilon)) multiplier update steps, with an overall complexity of O(1/epsilon). We will discuss the key steps in the algorithm development and show numerical results for basis pursuit, principal component pursuit and stable principal component pursuit.
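To illustrate the basic mechanism (a generic sketch under our own assumptions, not Aybat and Iyengar's algorithm), here is a first-order augmented Lagrangian solver for basis pursuit, min ||x||_1 subject to Ax = b: the inner subproblem is solved with ISTA (soft-thresholded gradient steps), and the multiplier and penalty parameter are updated during the run.

```python
import numpy as np

rng = np.random.default_rng(2)

def soft_threshold(v, t):
    """Proximal operator of the l1 norm."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

# Hypothetical basis pursuit instance: recover sparse x_true from A x = b
m, n, k = 40, 100, 5
A = rng.normal(size=(m, n)) / np.sqrt(m)
x_true = np.zeros(n)
x_true[rng.choice(n, k, replace=False)] = rng.normal(0.0, 3.0, size=k)
b = A @ x_true

# Augmented Lagrangian: min ||x||_1 + lam^T (A x - b) + (mu/2)||A x - b||^2,
# with multiplier updates lam += mu (A x - b) and a growing penalty mu.
x = np.zeros(n)
lam = np.zeros(m)
mu = 1.0
L = np.linalg.norm(A, 2) ** 2            # so mu * L bounds the smooth-part Lipschitz constant

for outer in range(40):
    step = 1.0 / (mu * L)
    for inner in range(300):             # ISTA on the inner subproblem (warm-started)
        grad = A.T @ (lam + mu * (A @ x - b))
        x = soft_threshold(x - step * grad, step)
    lam = lam + mu * (A @ x - b)         # multiplier update
    mu = min(mu * 2.0, 1e3)              # grow the penalty during the run (capped)

print(np.linalg.norm(A @ x - b), np.max(np.abs(x - x_true)))
```

In this regime (k much smaller than m) basis pursuit recovers the planted sparse vector, so the residual and the recovery error both shrink as the multiplier updates accumulate.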

Joint work with N. Serhat Aybat (Penn State)