Gibbs algorithm
In statistical mechanics, the Gibbs algorithm, introduced by J. Willard Gibbs in 1902, is a criterion for choosing a probability distribution for the statistical ensemble of microstates of a thermodynamic system by maximising the average negative log probability (or information-theoretic entropy)

H = ⟨−ln p_i⟩ = −∑_i p_i ln p_i

subject to the probability distribution p_i satisfying a set of constraints (usually expectation values) corresponding to the known macroscopic quantities. Physicists call the result of applying the Gibbs algorithm the Gibbs distribution for the given constraints, most notably Gibbs's grand canonical ensemble for open systems, in which both the average energy and the average number of particles are constrained. (See also partition function.)
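The procedure can be illustrated numerically. The sketch below, which assumes a hypothetical toy system of three microstates with made-up energy levels, finds the maximum-entropy distribution consistent with a fixed average energy: the solution has the Gibbs (Boltzmann) form p_i ∝ exp(−βE_i), and the Lagrange multiplier β is found by matching the constraint with bisection.

```python
import math

# Hypothetical toy system: three microstates with arbitrary energies.
energies = [0.0, 1.0, 2.0]
target_mean_energy = 0.8  # the known macroscopic constraint <E> (assumed value)

def gibbs_distribution(beta):
    """Maximum-entropy distribution for an energy constraint: p_i ∝ exp(-beta * E_i)."""
    weights = [math.exp(-beta * e) for e in energies]
    z = sum(weights)  # partition function
    return [w / z for w in weights]

def mean_energy(beta):
    """Expectation value <E> under the Gibbs distribution at this beta."""
    p = gibbs_distribution(beta)
    return sum(pi * e for pi, e in zip(p, energies))

# <E> decreases monotonically in beta, so bisection finds the beta
# that satisfies the constraint mean_energy(beta) == target_mean_energy.
lo, hi = -10.0, 10.0
for _ in range(100):
    mid = (lo + hi) / 2
    if mean_energy(mid) > target_mean_energy:
        lo = mid  # energy too high: increase beta
    else:
        hi = mid
beta = (lo + hi) / 2
p = gibbs_distribution(beta)
```

The resulting p maximises −∑ p_i ln p_i among all distributions with the prescribed mean energy; adding a particle-number constraint in the same way would yield the grand canonical ensemble.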
In the light of Claude Shannon's information theory, in 1957 E.T. Jaynes re-interpreted the Gibbs algorithm as a much more general, more widely applicable inference technique, leading to the principle of maximum entropy, and the MaxEnt view of thermodynamics.
The general result of applying the Gibbs algorithm is a maximum entropy probability distribution. Statisticians identify such distributions as belonging to exponential families.
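Why the result is an exponential-family distribution can be seen from the standard Lagrange-multiplier derivation (a textbook sketch, not specific to this article). Maximising the entropy subject to normalisation and expectation-value constraints $\sum_i p_i f_k(i) = F_k$ means stationarising

$$
\mathcal{L} = -\sum_i p_i \ln p_i \;-\; \lambda_0\Big(\sum_i p_i - 1\Big) \;-\; \sum_k \lambda_k\Big(\sum_i p_i f_k(i) - F_k\Big).
$$

Setting $\partial \mathcal{L} / \partial p_i = 0$ gives $-\ln p_i - 1 - \lambda_0 - \sum_k \lambda_k f_k(i) = 0$, hence

$$
p_i = \frac{1}{Z} \exp\Big(-\sum_k \lambda_k f_k(i)\Big),
\qquad
Z = \sum_i \exp\Big(-\sum_k \lambda_k f_k(i)\Big),
$$

which is precisely the exponential-family form; with a single energy constraint $f(i) = E_i$ it reduces to the canonical (Boltzmann) distribution, and $Z$ is the partition function.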
Not to be confused with the Gibbs sampler, an update algorithm used in Markov chain Monte Carlo iterations, which is a special case of the Metropolis–Hastings algorithm.