Talk:Gibbs algorithm

WikiProject Physics (unassessed)
This article is within the scope of WikiProject Physics, a collaborative effort to improve the coverage of Physics on Wikipedia. If you would like to participate, please visit the project page, where you can join the discussion and see a list of open tasks.
This article has not yet received a rating on Wikipedia's content assessment scale.
This article has not yet received a rating on the project's importance scale.

This page mentions two things: the Gibbs algorithm and the Gibbs distribution. In my opinion, both are important and should be treated separately. According to various Markov Random Field literature, the Gibbs distribution takes the form

<math> P(f) = \frac{1}{Z} \, e^{-\frac{1}{T} U(f)} </math>

where

<math> Z = \sum_{f} e^{-\frac{1}{T} U(f)} </math>

is a normalizing factor, T is a constant called the temperature, and U(f) is an energy function. For a specific choice of U(f), this reduces to the (Gaussian) normal distribution.
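
For concreteness, here is a minimal sketch (my own illustration, not taken from the literature) of this distribution on a small discrete configuration space; the quadratic energy function and the temperature value are just assumptions for the example:

<syntaxhighlight lang="python">
import numpy as np

def gibbs_distribution(energies, T=1.0):
    """Normalized Gibbs probabilities P(f) = exp(-U(f)/T) / Z over a finite set of configurations."""
    weights = np.exp(-np.asarray(energies, dtype=float) / T)  # unnormalized e^{-U(f)/T}
    Z = weights.sum()                                          # normalizing factor Z
    return weights / Z

# Illustrative example: a quadratic energy U(f) = f^2 / 2 on a discretized line
# gives probabilities proportional to a (discretized) Gaussian, as noted above.
f = np.linspace(-3.0, 3.0, 13)
p = gibbs_distribution(0.5 * f**2, T=1.0)
print(p.sum())  # probabilities sum to 1
</syntaxhighlight>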

Gibbs Algorithm vs Gibbs Sampler

This article states that the Gibbs algorithm is different from the Gibbs sampler. But I have encountered various interpretations of Markov Random Fields in terms of maximizing the entropy, which is often defined as

<math> H = -\sum_i p_i \log p_i </math>

This probably makes the Gibbs algorithm a special case of Markov chain Monte Carlo iteration. For an interpretation of Markov Random Fields in terms of entropy, see for example [http://www.ams.org/online_bks/conm1/], pages 5 and 6.
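
To make the claimed MCMC connection concrete, here is a minimal sketch of a Gibbs sampling sweep for a toy binary chain model; the Ising-style energy, the coupling J, and the temperature T are illustrative assumptions and are not taken from the reference:

<syntaxhighlight lang="python">
import numpy as np

rng = np.random.default_rng(0)

def gibbs_sweep(x, J=1.0, T=1.0):
    """One Gibbs sampling sweep over a cyclic chain of spins x[i] in {-1, +1}
    with energy U(x) = -J * sum_i x[i] * x[i+1]; each spin is resampled from
    its conditional distribution given its neighbours."""
    n = len(x)
    for i in range(n):
        field = J * (x[(i - 1) % n] + x[(i + 1) % n])  # local contribution of the neighbours
        p_up = 1.0 / (1.0 + np.exp(-2.0 * field / T))  # P(x[i] = +1 | neighbours)
        x[i] = 1 if rng.random() < p_up else -1
    return x

# Repeated sweeps form a Markov chain whose stationary distribution is the
# Gibbs distribution exp(-U(x)/T) / Z.
x = rng.choice([-1, 1], size=10)
for _ in range(1000):
    x = gibbs_sweep(x)
print(x)
</syntaxhighlight>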