Mixture Normal Distribution for Gibbs Sampler and its Application in the Surface of Single Crystal

Abstract:

The Gibbs sampler is widely used in Bayesian analysis, but it is often difficult to sample from the full conditional distributions, and this greatly weakens the sampler's efficiency. In this paper, we propose using a mixture normal distribution in the Gibbs sampler. The mixture normal distribution approximates the target distribution and therefore carries more information about it, which substantially improves the efficiency of the Gibbs sampler. Furthermore, combined with the mixture normal method, the Hit-and-Run algorithm also produces more efficient samples. Simulation results show that the Gibbs sampler with a mixture normal distribution outperforms other sampling algorithms. The method can also be applied to explore the surface of a single crystal.
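
The abstract describes the method only at a high level. As a rough illustration, and as a reading on my part that follows the companion manuscript [13] on mixture normal proposals, the sketch below uses a fixed two-component normal mixture as an independence proposal inside each coordinate update of a Gibbs scan, with a Metropolis-Hastings correction so the chain still targets the exact distribution. The toy bimodal target and all names (mix_pdf, mix_draw, gibbs_mixture_proposal) are hypothetical, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(1)

def mix_pdf(x, w, m, s):
    """Density of a univariate normal mixture at point x."""
    return np.sum(w * np.exp(-0.5 * ((x - m) / s) ** 2) / (s * np.sqrt(2 * np.pi)))

def mix_draw(w, m, s):
    """Draw once from the mixture: pick a component, then sample it."""
    k = rng.choice(len(w), p=w)
    return rng.normal(m[k], s[k])

def log_target(x):
    """Unnormalized log density of a toy bimodal 2-D target (equal-weight
    normal mixture at (-2,-2) and (2,2)), standing in for a posterior
    whose full conditionals are hard to sample directly."""
    c1, c2 = np.array([-2.0, -2.0]), np.array([2.0, 2.0])
    return np.logaddexp(-0.5 * np.sum((x - c1) ** 2),
                        -0.5 * np.sum((x - c2) ** 2))

def gibbs_mixture_proposal(n_iter, w, m, s):
    """Metropolis-within-Gibbs scan: each coordinate is updated via an
    independence proposal drawn from a mixture-normal approximation
    (w, m, s) of its full conditional, then accepted or rejected."""
    x = np.zeros(2)
    chain = np.empty((n_iter, 2))
    for t in range(n_iter):
        for j in range(2):
            y = x.copy()
            y[j] = mix_draw(w, m, s)
            # Independence-MH ratio: target ratio times reversed proposal ratio.
            log_a = (log_target(y) - log_target(x)
                     + np.log(mix_pdf(x[j], w, m, s))
                     - np.log(mix_pdf(y[j], w, m, s)))
            if np.log(rng.random()) < log_a:
                x = y
        chain[t] = x
    return chain

# A crude two-component approximation covering both conditional modes.
w, m, s = np.array([0.5, 0.5]), np.array([-2.0, 2.0]), np.array([1.2, 1.2])
chain = gibbs_mixture_proposal(20_000, w, m, s)
print("share of draws in the positive mode:", np.mean(chain[:, 0] > 0))
```

Because the mixture proposal places mass on both modes of each conditional, a single coordinate update can jump between modes, which is the efficiency gain the abstract describes; a single-normal proposal centered at the current point would rarely make such a crossing.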

Info:

Pages:

585-589

Online since:

June 2012

Copyright:

© 2012 Trans Tech Publications Ltd. All Rights Reserved

References:

[1] M. H. Chen and B. Schmeiser: Performance of the Gibbs, Hit-and-Run, and Metropolis samplers. Journal of Computational and Graphical Statistics. 2: 251-272. (1993).

DOI: 10.2307/1390645

[2] G. S. Fishman: Monte Carlo: Concepts, Algorithms, and Applications. Springer. New York. (1995).

[3] A. Gelman, G. O. Roberts and W. R. Gilks: Efficient Metropolis jumping rules. Bayesian Statistics. 5: 599-607. (1996).

[4] C. J. Geyer: Markov chain Monte Carlo maximum likelihood. Computing Science and Statistics: Proceedings of the 23rd Symposium on the Interface. 156-163. (1991).

[5] W. Härdle, M. Müller, S. Sperlich and A. Werwatz: Nonparametric and Semiparametric Models. Springer. New York. (2004).

[6] J. D. Hart: Nonparametric Smoothing and Lack-of-Fit Tests. Springer. New York. (1997).

[7] W. K. Hastings: Monte Carlo sampling methods using Markov chains and their applications. Biometrika. 57: 97-109. (1970).

DOI: 10.1093/biomet/57.1.97

[8] F. M. Liang, C. Liu and R. J. Carroll: Advanced Markov Chain Monte Carlo Methods: Learning from Past Samples. Wiley. New York. (2010).

DOI: 10.1002/9780470669723

[9] F. M. Liang and W. H. Wong: Real-parameter evolutionary Monte Carlo with applications to Bayesian mixture models. Journal of the American Statistical Association. 95: 121-134. (2001).

DOI: 10.1198/016214501753168325

[10] J. S. Liu: Monte Carlo Strategies in Scientific Computing. Springer. New York. (2001).

[11] N. Metropolis, A. W. Rosenbluth, M. N. Rosenbluth, A. H. Teller and E. Teller: Equation of state calculations by fast computing machines. Journal of Chemical Physics. 21: 1087-1092. (1953).

DOI: 10.1063/1.1699114

[12] W. Shao, G. B. Guo, F. Y. Meng and Y. J. Gai: Efficient proposal distribution in Metropolis-Hastings algorithm using a B-splines technique. Manuscript. (2011).

[13] W. Shao, G. Q. Zhao and L. P. Zhu: Mixture normal proposal for Metropolis-Hastings algorithm. Manuscript. (2012).

[14] G. O. Roberts, A. Gelman and W. R. Gilks: Weak convergence and optimal scaling of random walk Metropolis algorithms. The Annals of Applied Probability. 7: 110-120. (1997).

DOI: 10.1214/aoap/1034625254
