
Gibbs algorithm

From Wikipedia, the free encyclopedia
{{Disputed|date=March 2008}}


In [[statistical mechanics]], the '''Gibbs algorithm''', introduced by [[J. Willard Gibbs]] in 1878, is a criterion for choosing a [[probability distribution]] for the [[statistical ensemble]] of [[microstate (statistical mechanics)|microstate]]s of a [[thermodynamic system]] by maximising the average negative log probability (or [[entropy (information theory)|information-theoretic entropy]])


:<math> H = \sum_i -p_i \ln p_i \, </math>


subject to the probability distribution <math>p_i</math> satisfying a set of constraints (usually expectation values) corresponding to the known [[macroscopic]] quantities. Physicists call the result of applying the Gibbs algorithm the [[Gibbs distribution]] for the given constraints, most notably Gibbs's [[grand canonical ensemble]] for open systems when the average energy and the average number of particles are given. (See also ''[[Partition function (mathematics)|partition function]]'').
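As a numerical illustration (not part of the original text), the constrained maximisation can be carried out directly. The sketch below assumes three hypothetical energy levels and a known average energy; the maximiser has the Gibbs form <math>p_i \propto e^{-\beta E_i}</math>, so it suffices to solve for the multiplier <math>\beta</math> by bisection:

```python
import numpy as np

# Illustrative assumptions: three microstate energies and a known
# macroscopic constraint <E>.  We seek the distribution p_i that
# maximises H = -sum p_i ln p_i subject to sum p_i E_i = E_target.
E = np.array([0.0, 1.0, 2.0])   # assumed energy levels
E_target = 0.6                  # assumed average-energy constraint

def gibbs(beta):
    """Gibbs distribution p_i ∝ exp(-beta * E_i), normalised."""
    w = np.exp(-beta * E)
    return w / w.sum()

def mean_energy(beta):
    return float(gibbs(beta) @ E)

# <E> is monotonically decreasing in beta, so bisection works.
lo, hi = -50.0, 50.0            # mean_energy(lo) > E_target > mean_energy(hi)
for _ in range(200):
    mid = (lo + hi) / 2
    if mean_energy(mid) > E_target:
        lo = mid
    else:
        hi = mid
beta = (lo + hi) / 2
p = gibbs(beta)
print(beta, p)                  # multiplier and resulting Gibbs distribution
```

The multiplier <math>\beta</math> found this way plays the role of inverse temperature in the canonical ensemble.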


In the light of [[Claude E. Shannon|Claude Shannon]]'s [[information theory]], in 1957 [[E.T. Jaynes]] re-interpreted the Gibbs algorithm as a much more general, more widely applicable inference technique, leading to the [[principle of maximum entropy]], and the [[Maximum entropy thermodynamics|MaxEnt view of thermodynamics]].

Revision as of 22:34, 3 February 2015


This general result of the Gibbs algorithm is then a [[maximum entropy probability distribution]]. Statisticians identify such distributions as belonging to [[exponential family|exponential families]].
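The exponential-family form can be made explicit (a standard derivation, not spelled out above): introducing a Lagrange multiplier <math>\lambda_k</math> for each expectation constraint <math>\sum_i p_i f_k(i) = \langle f_k \rangle</math> and one for normalisation, stationarity of the Lagrangian gives

:<math> p_i = \frac{1}{Z(\lambda)} \exp\!\left(-\sum_k \lambda_k f_k(i)\right), \qquad Z(\lambda) = \sum_i \exp\!\left(-\sum_k \lambda_k f_k(i)\right), </math>

where <math>Z(\lambda)</math> is the [[Partition function (mathematics)|partition function]]; with the single constraint <math>f(i) = E_i</math> this reduces to the canonical Gibbs distribution.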

Not to be confused with the [[Gibbs sampling|Gibbs sampler]], an updating algorithm used in [[Markov chain Monte Carlo]] iterations, which is a special case of the [[Metropolis–Hastings algorithm]].
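For contrast, a minimal sketch of that unrelated Gibbs sampler (the bivariate-normal target and parameter values are illustrative assumptions, not from the text above). Each variable is redrawn in turn from its conditional distribution; for a bivariate normal with zero means, unit variances and correlation <math>\rho</math>, each conditional is <math>x \mid y \sim N(\rho y,\, 1-\rho^2)</math> and symmetrically for <math>y \mid x</math>:

```python
import math
import random

# Illustrative target: bivariate normal, zero means, unit variances,
# correlation rho.  Gibbs sampling alternates draws from the exact
# conditionals x|y ~ N(rho*y, 1-rho^2) and y|x ~ N(rho*x, 1-rho^2).
rho = 0.8
sigma = math.sqrt(1 - rho**2)
random.seed(0)

x, y = 0.0, 0.0
samples = []
for _ in range(20000):
    x = random.gauss(rho * y, sigma)
    y = random.gauss(rho * x, sigma)
    samples.append((x, y))

burn = samples[2000:]            # discard burn-in
n = len(burn)
mean_x = sum(s[0] for s in burn) / n
corr_xy = sum(s[0] * s[1] for s in burn) / n   # estimates E[XY] = rho
print(mean_x, corr_xy)
```

With enough iterations the sample mean should be close to 0 and the cross-moment close to <math>\rho</math>, since the chain's stationary distribution is the target normal.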