DESk-McMC
Differential Evolution Markov Chain Monte Carlo with Selection Mechanisms
DOI:
https://doi.org/10.5540/03.2025.011.01.0376

Keywords:
Markov chain Monte Carlo, Differential Evolution, Bayesian Analysis, Random Walk, Metropolis Algorithm

Abstract
The Metropolis algorithm remains one of the most popular tools in the Bayesian analysis of stochastic problems. It is often used when a priori knowledge of the target distribution is quite limited. However, the shape and size of the proposal distribution are known to be crucial for the convergence of the algorithm; the classical random-walk jump, for example, can often face convergence problems. In this context, Differential Evolution Markov chain Monte Carlo (DE) is an interesting alternative, but it can also suffer from a low acceptance rate. Inspired by genetic algorithm concepts, this work presents a new version of the DE algorithm in which a selection step is introduced. The new methodology, DESk-McMC, is applied to a simple Bayesian inference problem identified here as a polynomial “Black Box”. Different values of the selection pressure are studied. The results show that the inclusion of the selection step significantly increases the average acceptance rate of the Markov chains.
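As a rough illustration of the kind of update the abstract describes, the sketch below combines a standard DE-MC proposal (ter Braak, 2006) with a tournament-style selection of the donor chains, using the selection pressure as the tournament size. The tournament rule, the `pressure` parameter, and the Gaussian toy target are assumptions made for this sketch only; they are not taken from the paper and need not match the actual DESk-McMC selection mechanism.

```python
import numpy as np

rng = np.random.default_rng(0)

def log_target(x):
    # Toy target for illustration: standard Gaussian log-density (up to a constant).
    return -0.5 * np.sum(x ** 2)

def select_donor(log_posts, exclude, pressure, rng):
    # Tournament selection (assumed rule): draw `pressure` candidate chains,
    # excluding the indices in `exclude`, and return the fittest one.
    candidates = [j for j in range(len(log_posts)) if j not in exclude]
    pool = rng.choice(candidates, size=min(pressure, len(candidates)), replace=False)
    return pool[np.argmax(log_posts[pool])]

def de_sweep(chains, log_posts, gamma, pressure, rng):
    # One sweep over all chains: DE proposal built from two selected donor chains,
    # followed by a standard Metropolis accept/reject step.
    n_chains, dim = chains.shape
    accepted = 0
    for i in range(n_chains):
        r1 = select_donor(log_posts, {i}, pressure, rng)
        r2 = select_donor(log_posts, {i, r1}, pressure, rng)
        proposal = (chains[i] + gamma * (chains[r1] - chains[r2])
                    + 1e-4 * rng.standard_normal(dim))   # small jitter term
        log_post_prop = log_target(proposal)
        if np.log(rng.uniform()) < log_post_prop - log_posts[i]:
            chains[i], log_posts[i] = proposal, log_post_prop
            accepted += 1
    return accepted / n_chains

# Usage: 8 chains in 2 dimensions, gamma = 2.38 / sqrt(2 d) as suggested by ter Braak (2006).
dim, n_chains, n_sweeps = 2, 8, 2000
chains = rng.standard_normal((n_chains, dim))
log_posts = np.array([log_target(x) for x in chains])
gamma = 2.38 / np.sqrt(2 * dim)
rates = [de_sweep(chains, log_posts, gamma, pressure=3, rng=rng) for _ in range(n_sweeps)]
print("average acceptance rate:", np.mean(rates))
```

With `pressure = 1` the donors are drawn uniformly at random and the sweep reduces to plain DE-MC; larger values bias the difference vector toward fitter chains, which is one plausible way a selection step could raise the average acceptance rate.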
References
M. R. Borges and F. Pereira. “A novel approach for subsurface characterization of coupled fluid flow and geomechanical deformation: the case of slightly compressible flows”. In: Computational Geosciences 24 (2020), pp. 1693–1706.
N. Metropolis, A. W. Rosenbluth, M. N. Rosenbluth, A. H. Teller, and E. Teller. “Equation of state calculations by fast computing machines”. In: The Journal of Chemical Physics 21 (1953), pp. 1087–1092.
W. K. Hastings. “Monte Carlo sampling methods using Markov chains and their applications”. In: Biometrika 57 (1970), pp. 97–109.
C. Nemeth and P. Fearnhead. “Stochastic Gradient Markov Chain Monte Carlo”. In: Journal of the American Statistical Association 116.533 (2021), pp. 433–450.
H. Haario, E. Saksman, and J. Tamminen. “Adaptive proposal distribution for random walk Metropolis algorithm”. In: Computational Statistics 14.3 (1999), pp. 375–395.
A. Gelman, G. O. Roberts, and W. R. Gilks. “Efficient Metropolis Jumping Rules”. In: Bayesian Statistics. Ed. by J. M. Bernardo, J. O. Berger, A. P. Dawid, and A. F. M. Smith. Oxford University Press, Oxford, 1996, pp. 599–608.
C. J. F. ter Braak. “A Markov Chain Monte Carlo version of the genetic algorithm Differential Evolution: easy Bayesian computing for real parameter spaces”. In: Statistics and Computing 16 (2006), pp. 239–249.
G. O. Roberts and J. S. Rosenthal. “Optimal scaling for various Metropolis–Hastings algorithms”. In: Statistical Science 16 (2001), pp. 351–367.
S. P. Brooks and A. Gelman. “General Methods for Monitoring Convergence of Iterative Simulations”. In: Journal of Computational and Graphical Statistics 7 (1998), pp. 434–455.
J. A. Vrugt, C. J. F. ter Braak, C. G. H. Diks, B. A. Robinson, J. M. Hyman, and D. Higdon. “Accelerating Markov Chain Monte Carlo Simulation by Differential Evolution with Self-Adaptive Randomized Subspace Sampling”. In: International Journal of Nonlinear Sciences and Numerical Simulation 10 (2009), pp. 273–290.
J. A. Vrugt, C. J. F. ter Braak, M. P. Clark, J. M. Hyman, and B. A. Robinson. “Treatment of input uncertainty in hydrologic modeling: Doing hydrology backward with Markov chain Monte Carlo simulation”. In: Water Resources Research 44.12 (2008). Article W00B09. issn: 1944-7973.
D. Vats and C. Knudson. “Revisiting the Gelman–Rubin Diagnostic”. In: Statistical Science 36 (2021), pp. 518–529.