{{Distinguish |Gibbs sampler}}


In [[statistical mechanics]], the '''Gibbs algorithm''', introduced by [[J. Willard Gibbs]] in 1902, is a criterion for choosing a [[probability distribution]] for the [[statistical ensemble]] of [[microstate (statistical mechanics)|microstate]]s of a [[thermodynamic system]] by minimizing the average log probability (the negative of the [[entropy (information theory)|information-theoretic entropy]])


:<math> \langle\ln p_i\rangle = \sum_i p_i \ln p_i \, </math>


subject to the probability distribution {{math|''p<sub>i</sub>''}} satisfying a set of constraints (usually expectation values) corresponding to the known [[macroscopic]] quantities.<ref name=Dewar>{{cite book|first=Roderick C. |last=Dewar|chapter=4. Maximum Entropy Production and Non-equilibrium Statistical Mechanics|editor-last1=Kleidon|editor-first1=A.|title=Non-equilibrium Thermodynamics and the Production of Entropy: Life, Earth, and Beyond|date=2005|publisher=Springer|location=Berlin|isbn=9783540224952|pages=41&ndash;55|doi=10.1007/11672906_4}}</ref> In 1948, [[Claude E. Shannon|Claude Shannon]] interpreted the negative of this quantity, which he called [[entropy (information theory)|information entropy]], as a measure of the uncertainty in a probability distribution.<ref name=Dewar/> In 1957, [[E. T. Jaynes]] realized that the same quantity could be interpreted as missing information quite generally, and generalized the Gibbs algorithm to non-equilibrium systems through the [[principle of maximum entropy]] and [[maximum entropy thermodynamics]].<ref name=Dewar/>
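
As a worked illustration of the procedure: if the only constraints are normalization and a fixed mean energy <math>\langle E \rangle = \sum_i p_i E_i</math>, introducing [[Lagrange multiplier]]s <math>\lambda</math> and <math>\beta</math> and requiring the quantity

:<math> \sum_i p_i \ln p_i + \lambda \Big( \sum_i p_i - 1 \Big) + \beta \Big( \sum_i p_i E_i - \langle E \rangle \Big) </math>

to be stationary in each <math>p_i</math> gives <math>\ln p_i + 1 + \lambda + \beta E_i = 0</math>, i.e. the [[canonical ensemble]] distribution

:<math> p_i = \frac{e^{-\beta E_i}}{Z}, \qquad Z = \sum_i e^{-\beta E_i}, </math>

with <math>\beta</math> determined by the energy constraint.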


Physicists call the result of applying the Gibbs algorithm the [[Gibbs distribution]] for the given constraints, most notably Gibbs's [[grand canonical ensemble]] for open systems when the average energy and the average number of particles are given. (See also ''[[Partition function (mathematics)|partition function]]'').
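
In the grand canonical case, where the mean energy <math>\langle E \rangle</math> and the mean particle number <math>\langle N \rangle</math> are both constrained, the same construction yields

:<math> p_i = \frac{e^{-\beta (E_i - \mu N_i)}}{\mathcal{Z}}, \qquad \mathcal{Z} = \sum_i e^{-\beta (E_i - \mu N_i)}, </math>

with the multipliers <math>\beta</math> and <math>\mu</math> identified physically as the inverse temperature and the [[chemical potential]].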


This general result of the Gibbs algorithm is then a [[maximum entropy probability distribution]]. Statisticians identify such distributions as belonging to [[exponential family|exponential families]].
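
As a minimal numerical sketch of this construction (illustrative only, not part of Gibbs's treatment): given a finite set of energy levels and a target mean energy, the multiplier <math>\beta</math> of the canonical distribution above can be found by one-dimensional root-finding. The function name and the bracketing interval below are arbitrary choices, and [[SciPy]] is assumed to be available.

<syntaxhighlight lang="python">
import numpy as np
from scipy.optimize import brentq

def max_entropy_distribution(energies, mean_energy):
    """Maximum entropy distribution over `energies` with the given mean.

    Solves for the Lagrange multiplier beta such that
    sum_i E_i exp(-beta E_i) / Z equals `mean_energy`,
    then returns the probabilities p_i = exp(-beta E_i) / Z.
    """
    energies = np.asarray(energies, dtype=float)

    def excess_mean(beta):
        # Shift energies by their minimum for numerical stability;
        # the probabilities are unchanged by the shift.
        w = np.exp(-beta * (energies - energies.min()))
        p = w / w.sum()
        return p @ energies - mean_energy

    # The target mean must lie strictly between the smallest and largest
    # energy; the bracket [-50, 50] is an illustrative assumption.
    beta = brentq(excess_mean, -50.0, 50.0)
    w = np.exp(-beta * (energies - energies.min()))
    return w / w.sum(), beta

# Three levels with target mean energy 0.8: a Boltzmann-like distribution.
p, beta = max_entropy_distribution([0.0, 1.0, 2.0], 0.8)
print(p, beta)
</syntaxhighlight>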


==References==
{{Reflist}}


{{DEFAULTSORT:Gibbs Algorithm}}
