Slice sampling algorithm. A single transition $(x,u) \rightarrow (x',u')$ of a one-dimensional slice sampling algorithm has the following steps (a sketch in code follows the list).

1. Evaluate $P^*(x)$.
2. Draw a vertical coordinate $u' \sim \mathrm{Uniform}(0, P^*(x))$.
3. Create a horizontal interval $(x_l, x_r)$ enclosing $x$:
    - 3a. Draw $r \sim \mathrm{Uniform}(0,1)$.
    - 3b. $x_l := x - rw$
    - 3c. $x_r := x + (1-r)w$
    - 3d. While $P^*(x_l) > u'$: $x_l := x_l - w$
    - 3e. While $P^*(x_r) > u'$: $x_r := x_r + w$
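
The excerpt above stops at the stepping-out stage (step 3). Below is a minimal Python sketch of one full transition under stated assumptions: the unnormalized Gaussian `P_star` and the step width `w` are my own illustrative choices, and the final shrinkage loop is the standard continuation of the algorithm rather than part of the steps listed above.

```python
import numpy as np

rng = np.random.default_rng(0)

def slice_sample_step(x, P_star, w=1.0):
    """One (x, u) -> (x', u') transition of the 1-D slice sampler."""
    # (1)-(2): evaluate P*(x) and draw the vertical coordinate u'
    u_new = rng.uniform(0.0, P_star(x))

    # (3): create a horizontal interval (x_l, x_r) enclosing x (stepping out)
    r = rng.uniform(0.0, 1.0)
    x_l = x - r * w
    x_r = x + (1.0 - r) * w
    while P_star(x_l) > u_new:        # 3d: step out to the left
        x_l -= w
    while P_star(x_r) > u_new:        # 3e: step out to the right
        x_r += w

    # Standard continuation (not shown in the excerpt): draw x' from the
    # interval, shrinking the interval whenever the draw falls off the slice.
    while True:
        x_new = rng.uniform(x_l, x_r)
        if P_star(x_new) > u_new:
            return x_new, u_new
        if x_new < x:
            x_l = x_new
        else:
            x_r = x_new

# Example: sample from an unnormalized standard Gaussian
P_star = lambda x: np.exp(-0.5 * x**2)
x = 0.0
samples = []
for _ in range(5000):
    x, _ = slice_sample_step(x, P_star, w=2.0)
    samples.append(x)
```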

Continue reading

Efficient Monte Carlo sampling. This post is an extension of the post about the Hamiltonian Monte Carlo method, so I assume the readers have already read that post. Overrelaxation also reduces the random-walk behavior of Monte Carlo sampling and speeds up the convergence of the Markov chain. Gibbs sampling. Before studying overrelaxation, we study Gibbs sampling. In the general case of a system with K variables, a single iteration involves sampling one parameter at a time, each conditioned on the current values of the other variables.
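
As a concrete illustration of sampling one parameter at a time, here is a minimal sketch for the simplest case, K = 2 correlated Gaussian variables, where both conditional distributions are one-dimensional Gaussians; the correlation value and variable names are illustrative assumptions, not necessarily those used in the post.

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative target: a bivariate Gaussian with correlation rho, so each
# full conditional is a 1-D Gaussian that can be sampled exactly.
rho = 0.98
cond_scale = np.sqrt(1.0 - rho**2)

def gibbs_iteration(x1, x2):
    """One Gibbs sweep: sample each variable from its conditional in turn."""
    x1 = rng.normal(loc=rho * x2, scale=cond_scale)   # x1 | x2
    x2 = rng.normal(loc=rho * x1, scale=cond_scale)   # x2 | x1 (updated x1)
    return x1, x2

x1, x2 = 0.0, 0.0
chain = []
for _ in range(10000):
    x1, x2 = gibbs_iteration(x1, x2)
    chain.append((x1, x2))
chain = np.array(chain)
print(np.corrcoef(chain.T)[0, 1])   # should approach rho
```

With a correlation this strong, successive Gibbs samples change only slowly, which is exactly the random-walk behavior that overrelaxation is meant to reduce.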

Continue reading

Yay! Finally something more directly from physics to data science. We will also have a chance to see how the Metropolis-Hastings algorithm works! The Hamiltonian Monte Carlo method is a kind of Metropolis-Hastings method. One of the weak points of Monte Carlo sampling is its random-walk behavior. The Hamiltonian Monte Carlo method (HMC) is an approach to reducing this randomness in the sampling algorithm. Its original name was the hybrid Monte Carlo method.
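
As a rough one-dimensional sketch of the idea (not the post's script), the example below simulates Hamiltonian dynamics with a leapfrog integrator and then applies a Metropolis accept/reject step; the Gaussian target, step size, and number of leapfrog steps are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(2)

def U(x):                 # potential energy: U(x) = -log P*(x), Gaussian target
    return 0.5 * x**2

def grad_U(x):
    return x

def hmc_step(x, eps=0.2, n_leapfrog=20):
    """One HMC transition: leapfrog dynamics followed by a Metropolis test."""
    p = rng.normal()                        # resample the momentum
    x_new, p_new = x, p

    # Leapfrog integration of Hamilton's equations
    p_new -= 0.5 * eps * grad_U(x_new)
    for _ in range(n_leapfrog - 1):
        x_new += eps * p_new
        p_new -= eps * grad_U(x_new)
    x_new += eps * p_new
    p_new -= 0.5 * eps * grad_U(x_new)

    # Metropolis accept/reject based on the change in total energy H = U + K
    dH = (U(x_new) + 0.5 * p_new**2) - (U(x) + 0.5 * p**2)
    if np.log(rng.uniform()) < -dH:
        return x_new                        # accept
    return x                                # reject: keep the current state

x = 0.0
samples = []
for _ in range(5000):
    x = hmc_step(x)
    samples.append(x)
```

Because the dynamics follow the gradient of $-\log P^*(x)$ for many steps, successive samples can move far across the distribution instead of diffusing in small random-walk steps.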

Continue reading

To begin, I will proceed as an extension of the previous post. I will use the same target distribution function and a similar Gaussian proposal distribution. Even the Python script will be easier to understand if you have already read the previous post about importance sampling. Rejection sampling may be the most familiar kind of Monte Carlo sampling. When you need to introduce the Monte Carlo method to somebody, it is very intuitive and effective to give the example of computing the area of a circle (or any shape) using random samples.
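
As a quick illustration of that circle-area example (this is the intuitive hit-or-miss picture, not the rejection-sampling script discussed in the post), a minimal sketch might look like this:

```python
import numpy as np

rng = np.random.default_rng(3)

# Hit-or-miss Monte Carlo estimate of the area of the unit circle (true value: pi).
n = 100_000
x = rng.uniform(-1.0, 1.0, size=n)       # uniform points in the enclosing square
y = rng.uniform(-1.0, 1.0, size=n)
inside = x**2 + y**2 <= 1.0              # which points land inside the circle
area_estimate = 4.0 * inside.mean()      # square area (4) times the hit fraction
print(area_estimate)                     # close to 3.14159...
```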

Continue reading

Importance sampling is the first sampling method I encountered when I studied the Monte Carlo method. Nevertheless, I have not seen many examples of importance sampling. Maybe that is because importance sampling is not effective for high-dimensional systems. Its weak point is that its performance depends on how closely the proposal distribution we choose matches the target distribution. Here, I will present a simple example of importance sampling.
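
A minimal sketch of such an example might look as follows; the unnormalized target `P_star`, the Gaussian proposal `Q`, and the test function $\phi(x) = x^2$ are illustrative assumptions, not necessarily the ones used in the post.

```python
import numpy as np

rng = np.random.default_rng(4)

# Illustrative unnormalized target P*(x) and a Gaussian proposal Q(x).
P_star = lambda x: np.exp(-0.5 * x**2) * (1.0 + np.sin(3.0 * x)**2)
mu, sigma = 0.0, 1.5
Q = lambda x: np.exp(-0.5 * ((x - mu) / sigma)**2) / (sigma * np.sqrt(2.0 * np.pi))

n = 50_000
x = rng.normal(mu, sigma, size=n)        # draw samples from the proposal
w = P_star(x) / Q(x)                     # importance weights

# Self-normalized estimate of E[phi(x)] under the target, here phi(x) = x**2
phi = x**2
estimate = np.sum(w * phi) / np.sum(w)
print(estimate)
```

If the proposal has much lighter tails than the target, a few large weights dominate the sum and the estimate becomes unreliable, which is exactly the weak point mentioned above.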

Continue reading


Namshik Kim

physicist, data scientist


Vancouver, BC, Canada.