Talk:Gibbs algorithm
{{physics|class=stub|importance=low}} |
==Gibbs measure==
This page mentions two things: the Gibbs algorithm and the Gibbs distribution. In my opinion, both are important and should be treated separately. According to the Markov random field literature, the Gibbs distribution takes the form:
<math> P(f) = \frac{1}{Z} \, e^{-\frac{1}{T} U(f)}, </math>

where

<math> Z = \sum_{f} e^{-\frac{1}{T} U(f)} </math>
is a normalizing factor. '''T''' is a constant called the temperature, and U(f) is an energy function. For a specific choice of U(f), this leads to the (Gaussian) [[normal distribution]].
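As an illustrative aside (not from the article): the Gaussian case can be checked numerically by normalising exp(-U(f)/T) on a grid for the quadratic energy U(f) = f<sup>2</sup>/2. The grid, the temperature T = 2 and the choice of U below are arbitrary assumptions for this sketch.

<syntaxhighlight lang="python">
import numpy as np

# Discretised Gibbs distribution p(f) proportional to exp(-U(f)/T) (illustration only).
T = 2.0                               # temperature (arbitrary choice)
f = np.linspace(-10.0, 10.0, 2001)    # grid wide enough that the tails are negligible
df = f[1] - f[0]

U = 0.5 * f**2                        # quadratic energy: the Gaussian case
w = np.exp(-U / T)
Z = w.sum() * df                      # numerical normalizing factor
p = w / Z

# For U(f) = f^2/2 the Gibbs density equals a normal density with variance T.
gauss = np.exp(-f**2 / (2.0 * T)) / np.sqrt(2.0 * np.pi * T)
print(abs(p - gauss).max())           # close to zero up to discretisation error
</syntaxhighlight>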
Maybe the Gibbs distribution should redirect to [[Gibbs measure]]. (''Unsigned, [[User:146.50.1.141]], January 2007'')
:Now fixed. [[User:Linas|linas]] ([[User talk:Linas|talk]]) 21:16, 30 August 2008 (UTC) |
==Gibbs Algorithm vs Gibbs Sampler==
This article states that the Gibbs algorithm is different from the Gibbs sampler. However, I have encountered various interpretations of Markov random fields in terms of maximizing the entropy, which is often defined as
<math> H = -\sum_i p_i \log p_i </math>
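For context (a standard sketch, not part of the original comment): maximising this entropy subject to a fixed mean energy does recover the Gibbs form above. With Lagrange multipliers λ for the energy constraint and μ for normalisation,

<math> \frac{\partial}{\partial p_i}\left[ -\sum_j p_j \log p_j - \lambda \sum_j p_j U_j - \mu \sum_j p_j \right] = 0 \quad\Longrightarrow\quad p_i = \frac{1}{Z}\, e^{-\lambda U_i}, </math>

with λ playing the role of the inverse temperature 1/T and Z fixed by normalisation.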
This probably makes the Gibbs algorithm a special case of Markov chain Monte Carlo iteration. For an interpretation of Markov random fields in terms of entropy, see for example [http://www.ams.org/online_bks/conm1/], pages 5–6. (''Unsigned, [[User:146.50.1.141]], January 2007'')
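As a side note on the sampler itself: the Gibbs ''sampler'' is the MCMC procedure that repeatedly redraws one variable from its conditional distribution under the Gibbs measure. Below is a minimal sketch for a one-dimensional Ising-type MRF; the coupling J, temperature T, chain length and sweep count are arbitrary choices for illustration, not taken from any source.

<syntaxhighlight lang="python">
import numpy as np

rng = np.random.default_rng(0)

# Gibbs sampler for a 1-D Ising-type MRF with energy U(f) = -J * sum_i f[i] * f[i+1],
# spins f[i] in {-1, +1}.  Each sweep redraws every spin from its conditional
# distribution under the Gibbs measure p(f) proportional to exp(-U(f)/T).
J, T, n, sweeps = 1.0, 2.0, 50, 1000   # arbitrary illustrative parameters
f = rng.choice([-1, 1], size=n)

for _ in range(sweeps):
    for i in range(n):
        # Local field from the (at most two) neighbours of site i.
        h = J * (f[i - 1] if i > 0 else 0) + J * (f[i + 1] if i < n - 1 else 0)
        # Conditional probability that f[i] = +1 given its neighbours.
        p_up = 1.0 / (1.0 + np.exp(-2.0 * h / T))
        f[i] = 1 if rng.random() < p_up else -1

print("mean spin after sampling:", f.mean())
</syntaxhighlight>

Each site update only needs the neighbouring spins, which is what makes this sampler convenient for Markov random fields.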