

Probabilistic programming (PP) allows flexible specification of Bayesian statistical models in code. PyMC is a PP framework with an intuitive and readable, yet powerful, syntax that is close to the natural syntax statisticians use to describe models. It features next-generation Markov chain Monte Carlo (MCMC) sampling algorithms such as the No-U-Turn Sampler (NUTS; Hoffman & Gelman, 2014), a self-tuning variant of Hamiltonian Monte Carlo (HMC; Duane et al., 1987). This class of samplers works well on high-dimensional and complex posterior distributions and allows many complex models to be fit without specialized knowledge about fitting algorithms. HMC and NUTS take advantage of gradient information from the likelihood to achieve much faster convergence than traditional sampling methods, especially for larger models. NUTS also has several self-tuning strategies for adaptively setting the tunable parameters of Hamiltonian Monte Carlo, which means you usually do not need specialized knowledge about how the algorithms work.

This paper is a tutorial-style introduction to PyMC for those already somewhat familiar with Bayesian statistics.
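To make this concrete, here is a minimal sketch of the workflow (the data and parameter values are hypothetical, chosen purely for illustration): a normal model with unknown mean and standard deviation, sampled by calling pm.sample(), which assigns NUTS to continuous parameters automatically.

```python
import numpy as np
import pymc as pm

# Hypothetical data: 100 draws from a normal distribution
rng = np.random.default_rng(42)
data = rng.normal(loc=1.0, scale=2.0, size=100)

with pm.Model() as model:
    # Priors, written close to statistical notation
    mu = pm.Normal("mu", mu=0.0, sigma=10.0)
    sigma = pm.HalfNormal("sigma", sigma=5.0)

    # Likelihood of the observed data
    pm.Normal("obs", mu=mu, sigma=sigma, observed=data)

    # NUTS is selected automatically for the continuous parameters,
    # and its step size and mass matrix are tuned during warm-up
    idata = pm.sample()
```

Because NUTS self-tunes during the warm-up phase, the call above needs no sampler-specific arguments; the returned InferenceData object holds the posterior draws.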

Probabilistic programming allows for automatic Bayesian inference on user-defined probabilistic models. Gradient-based MCMC sampling algorithms, known as Hamiltonian Monte Carlo (HMC), allow inference on increasingly complex models but require gradient information that is often not trivial to calculate. PyMC is an open source probabilistic programming framework written in Python that uses Aesara to compute gradients via automatic differentiation, as well as to compile probabilistic programs on-the-fly to one of a suite of computational backends for increased speed. PyMC allows for model specification in Python code, rather than in a domain-specific language, making models easy to learn, customize, and debug.
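As a sketch of how this gradient machinery surfaces to the user (assuming a recent PyMC version, where Model exposes compile_logp and compile_dlogp), the joint log-density of a model and its gradient can be compiled and evaluated directly:

```python
import pymc as pm

with pm.Model() as model:
    # Random variables are ordinary Python objects, so standard
    # operators build up the underlying computation graph
    x = pm.Normal("x", mu=0.0, sigma=1.0)
    pm.Deterministic("y", x ** 2)

# Compile the model's joint log-density and its gradient; the gradient
# is derived by automatic differentiation of the computation graph
logp_fn = model.compile_logp()
dlogp_fn = model.compile_dlogp()

point = model.initial_point()  # a valid value for every free variable
print(logp_fn(point), dlogp_fn(point))
```

This is the same gradient information that HMC and NUTS consume internally, so the user never has to derive derivatives by hand.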
Note: This text is partly based on the PeerJ CS publication on PyMC by John Salvatier, Thomas V. Wiecki, and Christopher Fonnesbeck.