Stochastic Hopfield net

A Boltzmann machine is nothing but a stochastic Hopfield net. If you have not yet read the post on the Hopfield net in this blog, go read it first; I assume readers are familiar with it, and I directly use many results from that post. The magic of deep learning, which we have discussed a couple of times, works here too. As with the $\epsilon$-greedy off-policy algorithm, the stochastic character of the binary units allows the machine to occasionally increase its energy and escape from poor local minima.
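As a minimal sketch of that idea (my own illustration, not code from the post; the function name and signature are hypothetical): at temperature $T$, each binary unit is set to $+1$ with probability $\sigma(2a_i/T)$, where $a_i$ is its local field, so uphill moves in energy can happen.

```python
import numpy as np

def stochastic_update(W, x, b, T=1.0, rng=None):
    """One Gibbs sweep over the units of a stochastic Hopfield net (sketch).

    W : symmetric weight matrix with zero diagonal
    x : state vector with entries +1/-1 (modified in place)
    b : bias vector
    T : temperature; as T -> 0 this recovers the deterministic Hopfield update
    """
    rng = rng or np.random.default_rng()
    for i in rng.permutation(len(x)):
        a = W[i] @ x + b[i]                      # local field at unit i
        p = 1.0 / (1.0 + np.exp(-2.0 * a / T))   # P(x_i = +1 | all other units)
        x[i] = 1 if rng.random() < p else -1     # occasionally raises the energy
    return x
```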

Continue reading

Judgement Day

It is the first time I have not posted for four days; I was too busy preparing for this week's meetup. The meetup topic the day before yesterday was reinforcement learning, as I mentioned in the previous post. The paper, A Brief Survey of Deep Reinforcement Learning, is not long, but it includes 143 references. Ah, not my favorite. It did not explain the details of what I am interested in.

Continue reading

Binary Hopfield net using Hebbian learning

We want to study the Hopfield net starting from the simple case. A Hopfield net is a fully connected feedback network. A feedback network is any network that is not feedforward; in a feedforward network all connections are directed, whereas all the connections in our example will be bidirectional. This symmetry of the weights is an important property of the Hopfield net. Hopfield nets can act as associative memories, and they can be used to solve optimization problems.
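As a small sketch of the Hebbian rule the title refers to (my own illustration with hypothetical names; the post itself may use asynchronous rather than synchronous updates), the weights are built from outer products of the stored patterns:

```python
import numpy as np

def hebbian_weights(patterns):
    """Hebbian (outer-product) rule: w_ij = (1/N) * sum_k x_i^(k) x_j^(k).

    patterns : array of shape (K, N), entries +1/-1 (K memories, N units)
    """
    _, N = patterns.shape
    W = patterns.T @ patterns / N
    np.fill_diagonal(W, 0.0)        # no self-connections; W stays symmetric
    return W

def recall(W, x, steps=10):
    """Iterate the deterministic update toward a stored memory (sketch)."""
    for _ in range(steps):
        x = np.sign(W @ x)          # threshold each unit's local field
        x[x == 0] = 1               # break ties toward +1
    return x
```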

Continue reading

Why Ising model : 3 reasons for relevance

Studying the Ising model is useful for understanding phase transitions in various systems. The Hopfield network and the Boltzmann machine in neural networks are just generalized forms of the Ising model. The Ising model is also useful as a statistical model in its own right. Let $\boldsymbol{x}$, the state of an Ising model with $N$ spins, be a vector in which each component $x_n$ takes the value $-1$ or $+1$.
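Given that state vector, the standard Ising energy (a sketch under the usual conventions; couplings $J$ and uniform field $H$ are assumptions, not notation taken from the post) can be computed as:

```python
import numpy as np

def ising_energy(x, J, H=0.0):
    """Energy of an Ising state x with entries +1/-1.

    E(x) = -(1/2) * sum_{m,n} J_mn x_m x_n - H * sum_n x_n
    J : symmetric coupling matrix with zero diagonal
    H : uniform external field
    """
    return -0.5 * x @ J @ x - H * x.sum()
```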

Continue reading

Single neuron still has a lot to say

In the first neural network tutorial post, we studied the perceptron as a simple supervised learning machine. The perceptron is an amazing structure for understanding inference. In that post I said I would leave you to find the objective function and draw its plot. I introduce them here: the objective function and its contour plot.
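The excerpt does not reproduce the formula, so as a guess at what gets plotted: assuming the single neuron is a logistic unit trained by maximum likelihood, the objective is the cross-entropy error $G(\mathbf{w}) = -\sum_n \left[ t_n \ln y_n + (1 - t_n) \ln (1 - y_n) \right]$, and a contour plot evaluates it over a 2-D weight grid. The data below are toy values, not the post's.

```python
import numpy as np

def objective(w, X, t):
    """Cross-entropy objective G(w) for a single sigmoid neuron (assumed form)."""
    y = 1.0 / (1.0 + np.exp(-X @ w))            # neuron output in (0, 1)
    return -np.sum(t * np.log(y) + (1 - t) * np.log(1 - y))

X = np.array([[1.0, 0.5], [1.0, -1.0], [1.0, 2.0]])  # toy inputs (bias + 1 feature)
t = np.array([1, 0, 1])                              # toy binary targets
w0, w1 = np.meshgrid(np.linspace(-5, 5, 50), np.linspace(-5, 5, 50))
G = np.array([[objective(np.array([a, b]), X, t)
               for a, b in zip(row0, row1)]          # evaluate G on the grid
              for row0, row1 in zip(w0, w1)])
# Contours can then be drawn with e.g. matplotlib: plt.contour(w0, w1, G)
```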

Continue reading

Namshik Kim

physicist, data scientist

Vancouver, BC, Canada.