Wednesday, November 7, 2012
12:00PM GMCS 405

“How to Solve (Almost) Any Maximum Likelihood Problem Using Markov Chain Monte Carlo (MCMC)”


MCMC sampling allows simulation from complex, massively multivariate distributions even when the distribution is specified only up to an unknown normalizing constant. This talk presents the theory behind maximum likelihood parameter estimation using simulation (MCMC-MLE), along with little-known but vitally important implementation issues and heuristics, focusing on exponential families of distributions. In particular, a useful approximation to the normalizing constant in the exponential-family likelihood is presented which increases the stability and accuracy of the MCMC-MLE algorithm. We also show how MCMC standard errors can be used as a measure of when to trust this approximation. Finally, simple examples are used to illustrate how this algorithm can be used to perform inference on a new class of models for social networks dubbed Exponential-Family Random Network Models (ERNM).
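To make the idea concrete, here is a minimal illustrative sketch (not the speaker's implementation) of MCMC-MLE in the Geyer-Thompson style, for a toy one-parameter exponential family p_theta(x) ∝ exp(theta·g(x)). The ratio of normalizing constants Z(theta)/Z(theta0) is estimated by importance sampling over MCMC draws taken at a fixed reference parameter theta0, and the resulting approximate log-likelihood is maximized. All function names, the Bernoulli toy model, and the grid search are assumptions for illustration only:

```python
import math
import random

# Toy exponential family: p_theta(x) proportional to exp(theta * x), x in {0, 1},
# with sufficient statistic g(x) = x. Chosen only so the sketch is self-contained;
# the MCMC-MLE idea applies to far more complex models (e.g. network models).

def mcmc_sample(theta0, n, burn=1000, seed=0):
    """Metropolis sampler targeting p_theta0; never evaluates the normalizer."""
    rng = random.Random(seed)
    x, draws = 0, []
    for t in range(burn + n):
        prop = 1 - x  # flip proposal
        # Accept with probability min(1, exp(theta0 * (g(prop) - g(x)))).
        if math.log(rng.random() + 1e-300) < theta0 * (prop - x):
            x = prop
        if t >= burn:
            draws.append(x)
    return draws

def approx_loglik_ratio(theta, theta0, g_obs, draws):
    """Approximate l(theta) - l(theta0) per observation.
    log[Z(theta)/Z(theta0)] is estimated by the importance-sampling average
    (1/m) * sum_i exp((theta - theta0) * g(X_i)) over draws X_i from theta0."""
    m = len(draws)
    log_z_ratio = math.log(sum(math.exp((theta - theta0) * x) for x in draws) / m)
    return (theta - theta0) * g_obs - log_z_ratio

# "Observed" mean sufficient statistic from data simulated at true theta = 1.0.
true_theta = 1.0
data = mcmc_sample(true_theta, 5000, seed=1)
g_obs = sum(data) / len(data)

# Maximize the approximate log-likelihood over a grid around theta0 = 0.
theta0 = 0.0
draws = mcmc_sample(theta0, 5000, seed=2)
grid = [i / 100 for i in range(-300, 301)]
theta_hat = max(grid, key=lambda t: approx_loglik_ratio(t, theta0, g_obs, draws))
print(f"MCMC-MLE estimate: {theta_hat:.2f} (true theta = {true_theta})")
```

The importance-sampling average degrades as theta moves away from theta0, which is precisely where implementation heuristics such as the normalizing-constant approximation and MCMC standard errors discussed in the talk become important.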