STA2311H: Advanced Computational Methods for Statistics I

This course is part one of a two-course sequence that introduces graduate students to computational methods designed specifically for statistical inference. The course covers methods for optimization and simulation in several contexts: optimization methods are introduced in order to conduct likelihood-based inference, while simulation techniques are used to study the performance of a given statistical model and to conduct Bayesian analysis. Covered topics include gradient-based optimization algorithms (Newton's method, Fisher scoring), the Expectation-Maximization (EM) algorithm and its variants (ECM, MCEM, etc.), basic simulation principles and techniques for model analysis (cross-validation, independent replications, etc.), and Monte Carlo and Markov chain Monte Carlo algorithms (accept-reject sampling, importance sampling, Metropolis-Hastings and Gibbs samplers, adaptive MCMC, approximate Bayesian computation, consensus Monte Carlo, subsampling MCMC, etc.). Particular emphasis will be placed on modern developments that address Bayesian analysis when data are massive or the likelihood is intractable. The focus of the course is on correct usage of these methods rather than the detailed study of the underlying theoretical arguments.
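To give a flavour of the Markov chain Monte Carlo material listed above, the following is a minimal sketch of a random-walk Metropolis-Hastings sampler. It is an illustration only, not course material: the function name, the standard-normal target, and all tuning parameters (proposal standard deviation, number of iterations) are assumptions chosen for the example.

```python
import math
import random

def metropolis_hastings(log_target, init, proposal_sd, n_samples, seed=0):
    """Random-walk Metropolis-Hastings for a one-dimensional target.

    log_target: log of the (possibly unnormalized) target density.
    init: starting state of the chain.
    proposal_sd: standard deviation of the Gaussian random-walk proposal.
    """
    rng = random.Random(seed)
    x = init
    samples = []
    for _ in range(n_samples):
        # Propose a move from a normal centered at the current state.
        y = x + rng.gauss(0.0, proposal_sd)
        # Accept with probability min(1, pi(y)/pi(x)); the normalizing
        # constant of the target cancels in this ratio, which is why
        # only an unnormalized density is required.
        log_alpha = log_target(y) - log_target(x)
        if math.log(rng.random()) < log_alpha:
            x = y
        samples.append(x)
    return samples

# Example: sample from a standard normal target, pi(x) ∝ exp(-x^2/2).
samples = metropolis_hastings(lambda x: -0.5 * x * x,
                              init=0.0, proposal_sd=1.0, n_samples=20000)
mean = sum(samples) / len(samples)
var = sum((s - mean) ** 2 for s in samples) / len(samples)
```

With enough iterations the sample mean and variance should approach the target's values of 0 and 1; the later parts of the topic list (adaptive MCMC, subsampling MCMC, consensus Monte Carlo) address how to make such samplers practical when the data are massive.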

Credit Value: 0.50
Campus: St. George