Tuesday, December 18, 2012

The statistical literature on causal inference is built on notation that expresses the idea that a causal relationship sustains a counterfactual conditional (e.g., to say that taking the pill caused John to get better means that John took the pill and got better, and that had he not taken the pill, he would not have gotten better). Using this notation, causal estimands are defined, and the methods used to estimate them are evaluated for bias.
This talk will introduce this notation and literature, and point to some issues, such as mediation and interference, that have been addressed (at least somewhat) in the literature and that may be of interest and relevance to neuroscience.
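For concreteness, below is a minimal sketch of this notation in its standard potential-outcomes form; the symbols are the conventional ones from this literature (the Neyman-Rubin framework), not notation taken from the talk itself.

```latex
% Binary treatment A_i \in \{0,1\} for unit i, with two potential outcomes:
%   Y_i(1) = the outcome unit i would have if treated,
%   Y_i(0) = the outcome unit i would have if untreated.
% "Taking the pill caused John to get better" asserts Y_J(1) = 1 and Y_J(0) = 0.
% Only one of the two potential outcomes is ever observed:
Y_i = A_i \, Y_i(1) + (1 - A_i) \, Y_i(0).
% A standard causal estimand is the average treatment effect,
\tau = \mathbb{E}\bigl[\, Y(1) - Y(0) \,\bigr],
% and an estimator of this quantity is evaluated for bias relative to
% the counterfactual target it is meant to recover.
```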
We meet on Wednesdays at 1pm, in the 10th floor conference room of the Statistics Department, 1255 Amsterdam Ave, New York, NY.
Tuesday, December 4, 2012
Eftychios Pnevmatikakis: Dec 5
Tomorrow at 1PM I'm going to present an overview of recent work on approximate message passing (AMP) algorithms, with applications to compressed sensing (CS). I'm going to start with a brief overview of message passing algorithms [1] and then show how they were used in [2] to derive an AMP algorithm for the standard CS setup (basis pursuit, lasso). Time permitting, I'm going to briefly present some extensions of this methodology to the case of more general graphical models [3].
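To make the AMP iteration concrete, here is a minimal NumPy sketch of the soft-thresholding AMP recursion for the basis pursuit/lasso setup of [2]. The threshold schedule (a multiple `alpha` of the empirical residual standard deviation) and the fixed iteration count are illustrative assumptions, not the tuned choices analyzed in the paper.

```python
import numpy as np

def soft_threshold(v, theta):
    # Componentwise soft thresholding: eta(v; theta) = sign(v) * max(|v| - theta, 0).
    return np.sign(v) * np.maximum(np.abs(v) - theta, 0.0)

def amp_lasso(A, y, n_iter=30, alpha=1.5):
    """Sketch of the AMP iteration for recovering a sparse x from y = A x + noise.

    A : (n, N) sensing matrix (columns roughly unit-norm), y : (n,) measurements.
    alpha : assumed threshold multiplier; in practice it is tuned to the sparsity level.
    """
    n, N = A.shape
    x = np.zeros(N)   # current signal estimate
    z = y.copy()      # current residual, initialized to the measurements
    for _ in range(n_iter):
        # Threshold proportional to the empirical std of the residual.
        theta = alpha * np.linalg.norm(z) / np.sqrt(n)
        # Thresholding step: x_{t+1} = eta(x_t + A^T z_t; theta_t).
        x_new = soft_threshold(x + A.T @ z, theta)
        # Residual update with the Onsager correction term: the previous residual
        # scaled by (number of active coordinates) / n. This correction is what
        # distinguishes AMP from plain iterative soft thresholding.
        z = y - A @ x_new + (np.count_nonzero(x_new) / n) * z
        x = x_new
    return x
```

For a quick sanity check, one can draw A with i.i.d. N(0, 1/n) entries, generate a sparse x with n well above the sparsity level, set y = A @ x, and verify that the residual norm shrinks across iterations.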
Material will be drawn from the following sources:
[1] Kschischang, Frank R., Brendan J. Frey, and H.-A. Loeliger. "Factor graphs and the sum-product algorithm." IEEE Transactions on Information Theory 47.2 (2001): 498-519.
[2] Donoho, David L., Arian Maleki, and Andrea Montanari. "Message-passing algorithms for compressed sensing." Proceedings of the National Academy of Sciences 106.45 (2009): 18914-18919.
[3] Rangan, Sundeep, et al. "Hybrid approximate message passing with applications to structured sparsity." arXiv preprint arXiv:1111.2581 (2011).