Introduction to entropy

The concept of entropy is central to the second law of thermodynamics, which deals with physical processes and whether they occur spontaneously. In general terms, the second law says that temperature differences between systems in contact with each other tend to even out, and that work can be obtained from these non-equilibrium differences, but that some loss of heat, registered as an increase in entropy, occurs whenever work is done. The concept of energy is central to the first law of thermodynamics, which deals with the conservation of energy: under the first law, the heat that is lost reappears as an increase in the internal energy of the thermodynamic system that absorbs it. Thermodynamic entropy provides a comparative measure of how widely this internal energy is dispersed. A simpler and more concrete visualisation of the second law is that energy of all types changes from being localized to becoming dispersed or spread out, if it is not hindered from doing so. Entropy change is the quantitative measure of that kind of spontaneous process: how much energy has flowed, or how widely it has become spread out, at a specific temperature.

Entropy describes any of several phenomena, depending on the field and the context in which it is being used.

Originally, entropy was named to describe the "waste heat" from engines and other mechanical devices. Later, the term acquired several additional descriptions as more was understood about the behavior of molecules at the microscopic level. In the late 19th century Ludwig Boltzmann used the word "disorder" to describe the increased molecular movement at the microscopic level; that was before quantum behavior came to be better understood by Werner Heisenberg and those who followed. For most of the 20th century, textbooks tended to describe entropy as "disorder", following Boltzmann. More recently there has been a trend in chemistry and physics textbooks to describe entropy in terms of "dispersal of energy".

Other disciplines, such as chemistry, have taken entropy to mean the dispersal of particles as well as of energy; when substances are mixed together, there are instances where the particles and the energy disperse at different rates. Still other disciplines, such as the information sciences, have developed related concepts such as information entropy. Thermodynamic (heat) entropy may also be treated with statistical mechanics, which describes entropy at the microscopic level.

Technical descriptions of entropy

Traditionally, 20th century chemistry textbooks have described entropy as "a measurement of the disorder or randomness of a system".[citation needed] Modern chemistry textbooks have increasingly tended to say: "Entropy measures the spontaneous dispersal of energy... — at a specific temperature."[citation needed]

In thermodynamics, entropy S is one of the basic thermodynamic state functions and appears, together with the internal energy U, in thermodynamic potentials such as the Helmholtz free energy F.
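For reference (the relation is standard, although it is not written out in this article), the Helmholtz free energy is defined from the internal energy and the entropy as

F = U − T S,

where T is the absolute temperature; at constant temperature, the larger the entropy, the smaller the portion of the internal energy that remains available to do work.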

Entropy increase as energy dispersal

Where heat entropy involves a heat source, there is a continuous flow of that energy through the surrounding physical structure or space. From a technical standpoint, entropy is defined as a measure of the heat energy in a given place, or in a defined space, at a given instant in time, identified as the quantity S. A change in this measurement is called "Delta-S" (ΔS). Where a heat-generating source (one that produces heat in excess of the rest of the defined space) is not involved, entropy within a defined space involves a movement of existing energy and/or particles towards a steady state, known as equilibrium.
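The defining relation is not stated at this point in the article, but the classical (Clausius) definition of an entropy change, for a small quantity of heat q_rev transferred reversibly at absolute temperature T, is

ΔS = q_rev / T,

so the same quantity of heat corresponds to a larger entropy change when it is transferred at a lower temperature.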

The description of entropy as amounts of "mixedupness" or "disorder", and the abstract nature of statistical mechanics, can lead to confusion and considerable difficulty for students beginning the subject.[1][2] An approach to instruction emphasising the qualitative simplicity of entropy has been developed.[3] In this approach, entropy increase is described as the spontaneous dispersal of energy: how much energy is spread out in a process, or how widely spread out it becomes, at a specific temperature. A cup of hot coffee in a room will eventually cool, and the room will become a bit warmer. The heat energy that was concentrated in the hot coffee is dispersed throughout the room, and the net entropy of the system (coffee and room) increases as a result.
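A rough back-of-the-envelope version of the coffee example, using the Clausius relation above (the numbers are assumed purely for illustration and do not come from the article): if a small amount of heat, say q = 100 J, leaves coffee at roughly 350 K and enters a room at roughly 295 K, and the temperatures are treated as approximately constant for such a small transfer, then

ΔS_coffee ≈ −(100 J) / (350 K) ≈ −0.29 J/K
ΔS_room ≈ +(100 J) / (295 K) ≈ +0.34 J/K
ΔS_total ≈ +0.05 J/K

The entropy lost by the hot coffee is smaller than the entropy gained by the cooler room, so the net entropy of coffee plus room increases, as the second law requires.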

In this approach the statistical interpretation is related to quantum mechanics, and the generalization is made for molecular thermodynamics that "Entropy measures the energy dispersal for a system by the number of accessible microstates, the number of arrangements (each containing the total system energy) for which the molecules' quantized energy can be distributed, and in one of which – at a given instant – the system exists prior to changing to another."[4] On this basis the claim is made that in all everyday spontaneous physical happenings and chemical reactions, some type of energy flows from being localized or concentrated to becoming spread out: often to a larger space, always to a state with a greater number of microstates.[3] Thus in situations such as the entropy of mixing it is not strictly correct to visualise a spatial dispersion of energy; rather, the motional energy of each constituent's molecules is spread out in the sense of having the chance of being, at one instant, in any one of many more microstates in the larger mixture than each had before the process of mixing.[4]
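The quantitative link between entropy and the number of accessible microstates, not written out in this article but standard in statistical mechanics, is Boltzmann's relation

S = k_B ln W,

where W is the number of microstates consistent with the macroscopic state and k_B ≈ 1.38 × 10⁻²³ J/K is the Boltzmann constant; any spontaneous change that increases the number of accessible microstates therefore increases S.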

The subject remains subtle and difficult, and in complex cases the qualitative relation of energy dispersal to entropy change can be so inextricably obscured that it is moot. However, it is claimed that this does not weaken its explanatory power for beginning students.[3] In all cases, however, the statistical interpretation holds: entropy increases as the system moves from a macrostate that is very unlikely (for a system in equilibrium) to a macrostate that is much more probable.
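The sense in which the dispersed macrostate is "much more probable" can be made concrete with a toy counting model (a sketch under assumed, illustrative parameters; the model and the numbers are not from the article). Two small Einstein-solid blocks of 10 oscillators each share 20 quanta of energy, and the number of microstates for a block of N oscillators holding q quanta is the standard combinatorial count C(q + N − 1, q).

from math import comb

def multiplicity(n_oscillators, quanta):
    # Number of microstates of an Einstein solid with the given number
    # of oscillators and energy quanta: C(quanta + n_oscillators - 1, quanta).
    return comb(quanta + n_oscillators - 1, quanta)

N_A, N_B, Q_TOTAL = 10, 10, 20  # illustrative sizes, chosen arbitrarily

# For each macrostate (defined by how many quanta q_a sit in block A),
# count the microstates of the combined two-block system.
macrostates = [(q_a, multiplicity(N_A, q_a) * multiplicity(N_B, Q_TOTAL - q_a))
               for q_a in range(Q_TOTAL + 1)]

total_microstates = sum(count for _, count in macrostates)
best_q_a, best_count = max(macrostates, key=lambda item: item[1])

print(f"evenly spread energy:  q_a = {best_q_a}, "
      f"probability {best_count / total_microstates:.2%}")
print(f"all energy in block A: q_a = {Q_TOTAL}, "
      f"probability {macrostates[Q_TOTAL][1] / total_microstates:.2%}")

Running this sketch shows that the macrostate in which the energy is shared evenly between the two blocks is realised by far more microstates than the macrostate in which all of the energy stays in one block, which is precisely the statistical sense in which energy dispersal and entropy increase go together.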

Notes and references