Hamiltonian Monte Carlo Sampling

Metropolis, slice, and Gibbs sampling tend to be inefficient in high dimensions because of their local random-walk behaviour. Hamiltonian Monte Carlo (HMC) is built on a geometric understanding of the target distribution, borrowing concepts from physics to generate transitions that move rapidly through it. HMC also underlies the MCMC implementations in popular tools such as Stan and PyMC3. In this talk we'll motivate and introduce the algorithm itself, covering both its benefits and its shortcomings.
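To make the idea concrete, here is a minimal sketch of HMC in Python, under simplifying assumptions not stated in the abstract: a standard-normal target, a fixed step size and path length, and illustrative function names (`leapfrog`, `hmc`) that do not come from Stan or PyMC3.

```python
import numpy as np

def grad_neg_log_p(q):
    # Gradient of -log p(q) for a standard-normal target: d/dq (q**2 / 2) = q.
    return q

def leapfrog(q, p, step_size, n_steps):
    # Simulate Hamiltonian dynamics with the leapfrog integrator.
    p = p - 0.5 * step_size * grad_neg_log_p(q)   # half step for momentum
    for _ in range(n_steps - 1):
        q = q + step_size * p                     # full step for position
        p = p - step_size * grad_neg_log_p(q)     # full step for momentum
    q = q + step_size * p
    p = p - 0.5 * step_size * grad_neg_log_p(q)   # final half step
    return q, -p                                  # negate momentum for reversibility

def hmc(n_samples, step_size=0.1, n_steps=20, seed=0):
    rng = np.random.default_rng(seed)
    samples = np.empty(n_samples)
    q = 0.0
    for i in range(n_samples):
        p = rng.normal()                          # resample auxiliary momentum
        q_new, p_new = leapfrog(q, p, step_size, n_steps)
        # Hamiltonian H(q, p) = -log p(q) + p**2 / 2; Metropolis-accept on dH.
        h_old = 0.5 * q**2 + 0.5 * p**2
        h_new = 0.5 * q_new**2 + 0.5 * p_new**2
        if rng.uniform() < np.exp(h_old - h_new):
            q = q_new
        samples[i] = q
    return samples

samples = hmc(5000)
print(samples.mean(), samples.std())
```

The physics analogy is visible in the two ingredients: the momentum resampling, and the simulated trajectory that lets each proposal travel far from the current point instead of taking a local random-walk step.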

Useful links

References

  • A Conceptual Introduction to Hamiltonian Monte Carlo, Michael Betancourt. arXiv:1701.02434 [stat] (2018)
  • The No-U-Turn Sampler: Adaptively Setting Path Lengths in Hamiltonian Monte Carlo, Matthew D. Hoffman, Andrew Gelman. Journal of Machine Learning Research (2014)
