Approximate Message Passing for Inference in Generalized Linear Models
Generalized approximate message passing (GAMP) methods are a powerful new class of inference algorithms designed for generalized linear models, where an input vector x must be estimated from a noisy, possibly nonlinear function of a linear transform z = Ax. The methods are based on Gaussian approximations of loopy belief propagation and have the benefit of being computationally very simple and general. Moreover, under certain large random transforms, the algorithms are provably Bayes optimal, even in many non-convex problem instances. In this talk, I will provide an overview of GAMP methods and some of their recent extensions to unknown priors and structured uncertainty. I will also highlight some of the main issues in the convergence of the algorithm and discuss applications in neural connectivity detection from calcium imaging.
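To make the flavor of these iterations concrete, below is a minimal sketch of the simpler AMP recursion for the linear-Gaussian special case (Lasso-style soft thresholding), not the full GAMP algorithm from the talk. The measurement model, the i.i.d. Gaussian scaling of A, and the threshold parameter lam are illustrative assumptions, not details taken from the abstract.

```python
# Minimal AMP sketch for the linear-Gaussian special case of GAMP
# (sparse recovery via soft thresholding). Assumptions: A is m x n with
# i.i.d. N(0, 1/m) entries and y = A @ x_true + noise; "lam" is a
# hand-tuned hypothetical threshold.
import numpy as np

def soft_threshold(v, t):
    """Elementwise soft-thresholding denoiser eta(v; t)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def amp_lasso(A, y, lam=0.1, n_iter=30):
    m, n = A.shape
    x = np.zeros(n)   # current estimate of the input vector
    z = y.copy()      # Onsager-corrected residual
    for _ in range(n_iter):
        # Pseudo-data: effective noisy observation of x
        r = x + A.T @ z
        x_new = soft_threshold(r, lam)
        # Onsager correction: residual scaled by the average
        # derivative of the denoiser (fraction of active entries)
        onsager = (z / m) * np.count_nonzero(x_new)
        z = y - A @ x_new + onsager
        x = x_new
    return x

# Usage sketch on synthetic data
rng = np.random.default_rng(0)
m, n, k = 200, 400, 20
A = rng.normal(0.0, 1.0 / np.sqrt(m), size=(m, n))
x_true = np.zeros(n)
x_true[rng.choice(n, k, replace=False)] = rng.normal(size=k)
y = A @ x_true + 0.01 * rng.normal(size=m)
x_hat = amp_lasso(A, y, lam=0.1)
```

The Onsager term is what distinguishes AMP/GAMP from plain iterative thresholding: it decouples the effective noise across iterations and is what makes the large-random-matrix optimality analysis possible.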
Bio:
Dr. Rangan received the B.A.Sc. from the University of Waterloo, Canada, and the M.Sc. and Ph.D. from the University of California, Berkeley, all in Electrical Engineering. He has held postdoctoral appointments at the University of Michigan, Ann Arbor and at Bell Labs. In 2000, he co-founded (with four others) Flarion Technologies, a spin-off of Bell Labs that developed Flash-OFDM, the first cellular OFDM data system and a precursor to many 4G wireless technologies. In 2006, Flarion was acquired by Qualcomm Technologies, where Dr. Rangan was a Director of Engineering involved in OFDM infrastructure products. He joined the ECE department at the NYU Polytechnic School of Engineering in 2010. He is an IEEE Distinguished Lecturer of the Vehicular Technology Society. His research interests are in wireless communications, signal processing, information theory, and control theory.