![[Approximation_d'une_distribution_normale.gif|375]]
# Markov Chain Monte Carlo (MCMC)
**Markov Chain Monte Carlo (MCMC)** is a class of algorithms used to draw samples from a probability distribution.
MCMC is often used when we want to characterize a probability distribution or evaluate an integral that is extremely difficult to solve analytically or numerically. Instead of trying to solve it directly, we take a large number of samples from the function, combining the sampling aspects of [[monte carlo simulations]] with the conditional-probability restrictions imposed by the Markov property of markov chains.
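The Monte Carlo idea being borrowed here is that an intractable integral, such as an expectation under a target distribution $\pi$, can be approximated by an average over samples drawn from $\pi$:
$$
\mathbb{E}_{\pi}[f(x)] = \int f(x)\,\pi(x)\,dx \;\approx\; \frac{1}{N}\sum_{i=1}^{N} f(x_i), \qquad x_i \sim \pi
$$
MCMC is one way of producing those samples when we cannot draw from $\pi$ directly.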
They are similar to [[Hidden Markov Models (HMMs)]] in the sense that the sampler performs a *random walk* between the states of the [[markov chains|markov chain]].
Once we have a starting point, we randomly pick a nearby point and evaluate its probability.
- If its likelihood is higher than that of the current point, we move there.
- Otherwise, we either stay put or still move to that point with some additional acceptance probability.
Eventually, if this process is repeated enough times (under the right conditions), we will visit every point in the space at a frequency proportional to its actual probability.
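As a rough illustration, here is a minimal sketch of this accept/reject random walk in Python. The function `target_density` is a stand-in for whatever (possibly unnormalized) density we want to sample from, and the step size and starting point are arbitrary choices for the example:

```python
import numpy as np

def target_density(x):
    # Stand-in for an unnormalized target density; here a standard normal.
    return np.exp(-0.5 * x**2)

def random_walk_sampler(n_samples, step_size=1.0, x0=0.0, seed=0):
    rng = np.random.default_rng(seed)
    samples = np.empty(n_samples)
    x = x0
    for i in range(n_samples):
        # Propose a nearby point.
        proposal = x + rng.normal(scale=step_size)
        # Always accept moves to higher-density points; otherwise accept
        # with probability equal to the ratio of the two densities.
        accept_prob = min(1.0, target_density(proposal) / target_density(x))
        if rng.random() < accept_prob:
            x = proposal
        samples[i] = x
    return samples

samples = random_walk_sampler(10_000)
```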
## Formal Definitions
> See also:
> - [[Markov Chains]]
### Markov Chain Structure
The Markov chain structure utilized during an MCMC run is quite literally just a chain.
It relies on the Markov property itself rather than on more complex Markov chain structures (e.g. [[Hidden Markov Models (HMMs)]]).
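Concretely, the only structural assumption the chain makes is the Markov property: the next state depends only on the current one,
$$
P(x_{t+1} \mid x_t, x_{t-1}, \dots, x_0) = P(x_{t+1} \mid x_t).
$$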
### Determining Sample Acceptance
An MCMC sampler will not accept every candidate sample that it proposes.
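In the standard Metropolis-Hastings formulation, a proposed move from the current state $x$ to a candidate $x'$ drawn from a proposal distribution $q(x' \mid x)$ is accepted with probability
$$
\alpha(x, x') = \min\!\left(1,\; \frac{\pi(x')\, q(x \mid x')}{\pi(x)\, q(x' \mid x)}\right),
$$
where $\pi$ is the (possibly unnormalized) target distribution. If the proposal is rejected, the chain stays at $x$. For a symmetric proposal, the $q$ terms cancel and the rule reduces to the density ratio used in the sketch above.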
## Variations of MCMC Sampling
> See also:
> - [[Probability Theory]]
Instead of trying to optimize the emission probabilities (the likelihood of the observations) to identify a single most probable outcome, MCMC records the random walks that occurred during the simulation run and uses the resulting **probability density** to estimate a variable’s *expected value* and *variance*.
The main advantage is that MCMC can *escape local minima*, and it has been shown to be preferable to finding a single maximum-likelihood model in terms of both accuracy and stability.
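As a minimal sketch, once the walk has been recorded, the expected value and variance are just sample statistics of the recorded states, typically computed after discarding an initial burn-in portion. The array below is a placeholder; in practice `samples` would be the output of an MCMC run such as the `random_walk_sampler` sketch above:

```python
import numpy as np

# Placeholder so the snippet runs on its own; replace with the recorded
# states from an actual MCMC run (e.g. the random_walk_sampler sketch above).
samples = np.random.default_rng(0).normal(size=10_000)

burn_in = 1_000                  # discard early samples taken before the chain has mixed
kept = samples[burn_in:]

expected_value = kept.mean()     # estimate of the expected value under the target
variance = kept.var(ddof=1)      # estimate of the variance under the target
print(expected_value, variance)
```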
### The Metropolis-Hastings Algorithm
The algorithm was originally designed to aid in [[molecular dynamics simulations]].
### Gibbs Sampling
### Importance/Rejection Sampling
## MCMC For Parameter Inference