

General state space Markov chains and MCMC algorithms

Gareth O. Roberts, Lancaster University
Jeffrey S. Rosenthal, University of Toronto


Abstract
This paper surveys various results about Markov chains on general (non-countable) state spaces. It begins with an introduction to Markov chain Monte Carlo (MCMC) algorithms, which provide the motivation and context for the theory which follows. Then, sufficient conditions for geometric and uniform ergodicity are presented, along with quantitative bounds on the rate of convergence to stationarity. Many of these results are proved using direct coupling constructions based on minorisation and drift conditions. Necessary and sufficient conditions for Central Limit Theorems (CLTs) are also presented, in some cases proved via the Poisson Equation or direct regeneration constructions. Finally, optimal scaling and weak convergence results for Metropolis-Hastings algorithms are discussed. None of the results presented is new, though many of the proofs are. We also describe some Open Problems.
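
As a concrete illustration of the algorithms that motivate the theory surveyed here, the following is a minimal Python/NumPy sketch of the random-walk Metropolis algorithm. It is our own illustrative sketch, not code from the paper; the function names, the example target, and the proposal scale are assumptions made for illustration. The scale 2.38/sqrt(d) in the example echoes the well-known optimal scaling results for random-walk Metropolis discussed in the survey and in [67], [74].

    import numpy as np

    def random_walk_metropolis(log_f, x0, sigma, n_iter=10000, rng=None):
        """Sample from the density proportional to exp(log_f) via random-walk Metropolis."""
        rng = np.random.default_rng() if rng is None else rng
        x = np.asarray(x0, dtype=float)
        log_fx = log_f(x)
        chain = np.empty((n_iter, x.size))
        for t in range(n_iter):
            y = x + sigma * rng.standard_normal(x.size)   # symmetric Gaussian proposal
            log_fy = log_f(y)
            # Accept with probability min(1, f(y)/f(x)); the symmetric proposal cancels.
            if np.log(rng.uniform()) < log_fy - log_fx:
                x, log_fx = y, log_fy
            chain[t] = x
        return chain

    # Example: standard normal target in d = 5 dimensions, with proposal
    # standard deviation 2.38 / sqrt(d), the asymptotically optimal scaling
    # for i.i.d. unit-variance targets.
    d = 5
    chain = random_walk_metropolis(lambda x: -0.5 * np.sum(x ** 2),
                                   x0=np.zeros(d), sigma=2.38 / np.sqrt(d))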




Roberts, Gareth O. and Rosenthal, Jeffrey S., General state space Markov chains and MCMC algorithms. Probability Surveys 1 (2004), 20–71 (electronic).

References

[1]   D.J. Aldous, (1983), Random walk on finite groups and rapidly mixing Markov chains. Séminaire de Probabilités XVII. Lecture Notes in Math. 986, 243–297. Springer, New York. MR770418

[2]   D.J. Aldous and P. Diaconis (1987), Strong uniform times and finite random walks. Adv. Appl. Math. 8, 69–97. MR876954

[3]   D.J. Aldous and H. Thorisson (1993), Shift-coupling. Stoch. Proc. Appl. 44, 1–14. MR1198659

[4]   S. Asmussen (1987), Applied Probability and Queues. John Wiley & Sons, New York. MR889893

[5]   Y.F. Atchadé and J.S. Rosenthal (2003), On Adaptive Markov Chain Monte Carlo Algorithms. Submitted.

[6]   Y.F. Atchadé and J.S. Rosenthal (2003), Central Limit Theorems for Geometrically Ergodic Markov chains. Work in progress.

[7]   K.B. Athreya, H. Doss, and J. Sethuraman (1996), On the Convergence of the Markov Chain Simulation Method. Ann. Stat. 24, 69–100. MR1389881

[8]   K.B. Athreya and P. Ney (1978), A new approach to the limit theory of recurrent Markov chains. Trans. Amer. Math. Soc. 245, 493–501. MR511425

[9]   P.H. Baxendale (2003), Renewal theory and computable convergence rates for geometrically ergodic Markov chains. Preprint, University of Southern California. MR2049402

[10]   M. Bédard (2004), On the robustness of optimal scaling for Metropolis-Hastings algorithms. Ph.D. dissertation, University of Toronto. Work in progress.

[11]   K. Berenhaut and R.B. Lund (2001), Geometric renewal convergence rates from hazard rates. J. Appl. Prob. 38, 180–194. MR1816122

[12]   P. Billingsley (1961), The Lindeberg-Lévy theorem for martingales. Proc. Amer. Math. Soc. 12, 788–792. MR126871

[13]   P. Billingsley (1995), Probability and Measure, 3rd ed. John Wiley & Sons, New York. MR1324786

[14]   L. Breyer and G.O. Roberts (2000), From Metropolis to diffusions: Gibbs states and optimal scaling. Stoch. Proc. Appl. 90, 181–206. MR1794535

[15]   K.S. Chan and C.J. Geyer (1994), Discussion of reference [93]. Ann. Stat. 22, 1747–1758. MR1329179

[16]   O.F. Christensen, G.O. Roberts, and J.S. Rosenthal (2003), Scaling Limits for the Transient Phase of Local Metropolis-Hastings Algorithms. Submitted.

[17]   R. Cogburn (1972), The central limit theorem for Markov processes. In Proc. Sixth Berkeley Symp. Math. Statist. Probab. 2, 485–512.

[18]   M.K. Cowles and B.P. Carlin (1996), Markov Chain Monte Carlo Convergence Diagnostics: A Comparative Review. J. Amer. Stat. Assoc. 91, 883–904. MR1395755

[19]   M.K. Cowles, G.O. Roberts, and J.S. Rosenthal (1999), Possible biases induced by MCMC convergence diagnostics. J. Stat. Comp. Sim. 64, 87–104. MR1741840

[20]   M.K. Cowles and J.S. Rosenthal (1998), A simulation approach to convergence rates for Markov chain Monte Carlo algorithms. Statistics and Computing 8, 115–124.

[21]   P. Diaconis (1988), Group representations in Probability and Statistics. Institute of Mathematical Statistics, Hayward, California. MR964069

[22]   W. Doeblin (1938), Exposé de la théorie des chaînes simples constantes de Markov à un nombre fini d’états. Revue Mathématique de l’Union Interbalkanique 2, 77–105.

[23]   J.L. Doob (1953), Stochastic Processes. Wiley, New York. MR58896

[24]    R. Douc, E. Moulines, and J.S. Rosenthal (2002), Quantitative bounds on convergence of time-inhomogeneous Markov Chains. Ann. Appl. Prob., to appear.

[25]   R. Durrett (1991), Probability: theory and examples. Wadsworth, Pacific Grove, California. MR1068527

[26]   S.N. Ethier and T.G. Kurtz (1986), Markov processes, characterization and convergence. Wiley, New York. MR838085

[27]   J.A. Fill, M. Machida, D.J. Murdoch, and J.S. Rosenthal (2000), Extension of Fill’s perfect rejection sampling algorithm to general chains. Random Struct. Alg. 17, 290–316. MR1801136

[28]   G. Fort (2003), Computable bounds for V-geometric ergodicity of Markov transition kernels. Preprint, Université Joseph Fourier, Grenoble, France.

[29]   G. Fort and E. Moulines (2003), Polynomial ergodicity of Markov transition kernels. Stoch. Proc. Appl. 103, 57–99. MR1947960

[30]   A.E. Gelfand and A.F.M. Smith (1990), Sampling based approaches to calculating marginal densities. J. Amer. Stat. Assoc. 85, 398–409. MR1141740

[31]   A. Gelman and D.B. Rubin (1992), Inference from iterative simulation using multiple sequences. Stat. Sci. 7, 457–472.

[32]   S. Geman and D. Geman (1984), Stochastic relaxation, Gibbs distributions and the Bayesian restoration of images. IEEE Transactions on Pattern Analysis and Machine Intelligence 6, 721–741.

[33]   W.R. Gilks, S. Richardson, and D.J. Spiegelhalter, ed. (1996), Markov chain Monte Carlo in practice. Chapman and Hall, London. MR1397966

[34]   C. Geyer (1992), Practical Markov chain Monte Carlo. Stat. Sci. 7, 473–483.

[35]   W.R. Gilks, G.O. Roberts and S.K. Sahu (1998), Adaptive Markov Chain Monte Carlo, J. Am. Stat. Assoc., 93, 1045–1054. MR1649199

[36]   O. Häggström (2004), On the central limit theorem for geometrically ergodic Markov chains. Prob. Th. Rel. Fields, to appear. (Available at http://www.math.chalmers.se/~olleh/papers.html.)

[37]   W.K. Hastings (1970), Monte Carlo sampling methods using Markov chains and their applications. Biometrika 57, 97–109.

[38]   J.P. Hobert, G.L. Jones, B. Presnell, and J.S. Rosenthal (2002), On the Applicability of Regenerative Simulation in Markov Chain Monte Carlo. Biometrika 89, 731–743. MR1946508

[39]   I.A. Ibragimov (1963), A central limit theorem for a class of dependent random variables. Theory Prob. Appl. 8, 83–89. MR151997

[40]   I.A. Ibragimov and Y.V. Linnik (1971), Independent and Stationary Sequences of Random Variables. Wolters-Noordhoff, Groningen (English translation). MR322926

[41]   N. Jain and B. Jamison (1967), Contributions to Doeblin’s theory of Markov processes. Z. Wahrsch. Verw. Geb. 8, 19–40. MR221591

[42]   S.F. Jarner and G.O. Roberts (2002), Polynomial convergence rates of Markov chains. Ann. Appl. Prob. 12, 224–247. MR1890063

[43]   G.L. Jones (2004), On the Central Limit Theorem for Markov Chains (review paper). Work in progress.

[44]   G.L. Jones and J.P. Hobert (2001), Honest exploration of intractable probability distributions via Markov chain Monte Carlo. Statistical Science 16, 312–334. MR1888447

[45]   G.L. Jones and J.P. Hobert (2004), Sufficient burn-in for Gibbs samplers for a hierarchical random effects model. Ann. Stat. 32, 784–817. MR2060178

[46]   W. Kendall and J. Møller (2000), Perfect simulation using dominating processes on ordered state spaces, with application to locally stable point processes. Adv. Appl. Prob. 32, 844–865. MR1788098

[47]   C. Kipnis and S.R.S. Varadhan (1986), Central limit theorem for additive functionals of reversible Markov processes and applications to simple exclusions. Comm. Math. Phys. 104, 1–19. MR834478

[48]   T. Lindvall (1992), Lectures on the Coupling Method. Wiley & Sons, New York. MR1180522

[49]   R. Lund, S.P. Meyn, and R.L. Tweedie (1996), Rates of convergence of stochastically monotone Markov processes. Ann. Appl. Prob. 6, 218–237. MR1389838

[50]   A.A. Markov (1906), Extension of the law of large numbers to dependent quantities (in Russian). Izv. Fiz.-Matem. Obsch. Kazan Univ. (2nd Ser) 15, 135–156.

[51]   P. Matthews (1993), A slowly mixing Markov chain with implications for Gibbs sampling. Stat. Prob. Lett. 17, 231–236. MR1229942

[52]   K. L. Mengersen and R. L. Tweedie (1996), Rates of convergence of the Hastings and Metropolis algorithms. Ann. Statist. 24, 101–121. MR1389882

[53]   N. Metropolis, A. Rosenbluth, M. Rosenbluth, A. Teller, and E. Teller (1953), Equation of state calculations by fast computing machines. J. Chem. Phys. 21, 1087–1091.

[54]   S.P. Meyn and R.L. Tweedie (1993), Markov chains and stochastic stability. Springer-Verlag, London. Available at http://decision.csl.uiuc.edu/~meyn/pages/TOC.html. MR1287609

[55]   S.P. Meyn and R.L. Tweedie (1994), Computable bounds for convergence rates of Markov chains. Ann. Appl. Prob. 4, 981–1011. MR1304770

[56]   J. Møller, A.R. Syversveen, and R. Waagepetersen (1998), Log Gaussian Cox processes. Scand. J. Statist. 25, 451–482. MR1650019

[57]   P.A. Mykland, L. Tierney, and B. Yu (1995), Regeneration in Markov chain samplers. J. Amer. Stat. Assoc. 90, 233–241. MR1325131

[58]   R.M. Neal (2004), Improving Asymptotic Variance of MCMC Estimators: Non-reversible Chains are Better. Technical Report No. 0406, Dept. of Statistics, University of Toronto.

[59]   E. Nummelin (1978), Uniform and ratio limit theorems for Markov renewal and semi-regenerative processes on a general state space. Ann. Inst. Henri Poincaré Series B 14, 119–143. MR507729

[60]   E. Nummelin (1984), General irreducible Markov chains and non-negative operators. Cambridge University Press. MR776608

[61]   S. Orey (1971), Lecture notes on limit theorems for Markov chain transition probabilities. Van Nostrand Reinhold, London. MR324774

[62]   J.W. Pitman (1976), On coupling of Markov chains. Z. Wahrsch. verw. Gebiete 35, 315–322. MR415775

[63]   J.G. Propp and D.B. Wilson (1996). Exact sampling with coupled Markov chains and applications to statistical mechanics. Random Structures and Algorithms 9, 223–252. MR1611693

[64]   D. Randall (2003), Mixing, a tutorial on Markov chains. FOCS 2003.

[65]   G.O. Roberts (1998), Optimal Metropolis algorithms for product measures on the vertices of a hypercube. Stochastics and Stochastic Reports 62, 275–283. MR1613256

[66]   G.O. Roberts (1999), A note on acceptance rate criteria for CLTs for Metropolis-Hastings algorithms. J. Appl. Prob. 36, 1210–1217. MR1742161

[67]   G.O. Roberts, A. Gelman, and W.R. Gilks (1997), Weak convergence and optimal scaling of random walk Metropolis algorithms. Ann. Appl. Prob. 7, 110–120. MR1428751

[68]   G.O. Roberts and J.S. Rosenthal (1997), Shift-coupling and convergence rates of ergodic averages. Stochastic Models 13, 147–165. MR1430933

[69]   G.O. Roberts and J.S. Rosenthal (1997), Geometric ergodicity and hybrid Markov chains. Electronic Comm. Prob. 2, Paper no. 2, 13–25. MR1448322

[70]   G.O. Roberts and J.S. Rosenthal (1998), Optimal scaling of discrete approximations to Langevin diffusions. J. Roy. Stat. Soc. B 60, 255–268. MR1625691

[71]   G.O. Roberts and J.S. Rosenthal (1998), Markov chain Monte Carlo: Some practical implications of theoretical results (with discussion). Canadian J. Stat. 26, 5–31. MR1624414

[72]   G.O. Roberts and J.S. Rosenthal (2001), Small and Pseudo-Small Sets for Markov Chains. Stochastic Models 17, 121–145. MR1853437

[73]   G.O. Roberts and J.S. Rosenthal (2001), Markov chains and de-initialising processes. Scandinavian Journal of Statistics 28, 489–504. MR1858413

[74]   G.O. Roberts and J.S. Rosenthal (2001), Optimal scaling for various Metropolis-Hastings algorithms. Stat. Sci. 16, 351–367. MR1888450

[75]   G.O. Roberts and J.S. Rosenthal (2003), Harris Recurrence of Metropolis-Within-Gibbs and Transdimensional MCMC Algorithms. In preparation.

[76]   G.O. Roberts and R.L. Tweedie (1996), Geometric Convergence and Central Limit Theorems for Multidimensional Hastings and Metropolis Algorithms. Biometrika 83, 95–110. MR1399158

[77]   G.O. Roberts and R.L. Tweedie (1999), Bounds on regeneration times and convergence rates for Markov chains. Stoch. Proc. Appl. 80, 211–229. See also the corrigendum, Stoch. Proc. Appl. 91 (2001), 337–338. MR1682243

[78]   J.S. Rosenthal (1993), Rates of convergence for data augmentation on finite sample spaces. Ann. Appl. Prob. 3, 819–839. MR1233628

[79]   J.S. Rosenthal (1995), Rates of convergence for Gibbs sampler for variance components models. Ann. Stat. 23, 740–761. MR1345197

[80]   J.S. Rosenthal (1995), Minorization conditions and convergence rates for Markov chain Monte Carlo. J. Amer. Stat. Assoc. 90, 558–566. MR1340509

[81]   J.S. Rosenthal (1995), Convergence rates of Markov chains. SIAM Review 37, 387–405. MR1355507

[82]   J.S. Rosenthal (1996), Convergence of Gibbs sampler for a model related to James-Stein estimators. Stat. and Comput. 6, 269–275.

[83]   J.S. Rosenthal (2000), A first look at rigorous probability theory. World Scientific Publishing, Singapore. MR1767078

[84]   J.S. Rosenthal (2001), A review of asymptotic convergence for general state space Markov chains. Far East J. Theor. Stat. 5, 37–50. MR1848443

[85]   J.S. Rosenthal (2002), Quantitative convergence rates of Markov chains: A simple account. Elec. Comm. Prob. 7, No. 13, 123–128. MR1917546

[86]   J.S. Rosenthal (2003), Asymptotic Variance and Convergence Rates of Nearly-Periodic MCMC Algorithms. J. Amer. Stat. Assoc. 98, 169–177. MR1965683

[87]   J.S. Rosenthal (2003), Geometric Convergence Rates for Time-Sampled Markov Chains. J. Theor. Prob. 16, 671–688. MR2009198

[88]   A. Sinclair (1992), Improved bounds for mixing rates of Markov chains and multicommodity flow. Combinatorics, Prob., Comput. 1, 351–370. MR1211324

[89]   A.F.M. Smith and G.O. Roberts (1993), Bayesian computation via the Gibbs sampler and related Markov chain Monte Carlo methods (with discussion). J. Roy. Stat. Soc. Ser. B 55, 3–24. MR1210421

[90]   C. Stein (1971), A bound for the error in the normal approximation to the distribution of a sum of dependent random variables. In Proc. Sixth Berkeley Symp. Math. Statist. Prob. 3, 583–602. University of California Press. MR402873

[91]   H. Thorisson (2000), Coupling, Stationarity, and Regeneration. Springer, New York. MR1741181

[92]   E. Thönnes, A Primer in Perfect Simulation. In Statistical Physics and Spatial Statistics (K.R. Mecke and D. Stoyan, ed.), Springer Lecture Notes in Physics, 349–378. MR1870952

[93]   L. Tierney (1994), Markov chains for exploring posterior distributions (with discussion). Ann. Stat. 22, 1701–1762. MR1329166

[94]   P. Tuominen and R.L. Tweedie (1994), Subgeometric rates of convergence of f-ergodic Markov chains. Adv. Appl. Prob. 26, 775–798. MR1285459





Probability Surveys. ISSN: 1549-5787