Talk:Second law of thermodynamics/Archive 3
Archives
- Talk:Second law of thermodynamics/Archive1 '05-(Mar)'06
- Talk:Second law of thermodynamics/creationism (Nov 2005- Feb 2006)
Cut sloppy definitions
I cut the following ("consumed" was a word used back in the 17th century):
- If thermodynamic work is to be done at a finite rate, free energy must be consumed [1]
- The entropy of a closed system will not decrease for any sustained period of time (see Maxwell's demon)
- A downside to this last description is that it requires an understanding of the concept of entropy. There are, however, consequences of the second law that are understandable without a full understanding of entropy. These are described in the first section below.
For such an important & confusing law in science, we need textbook definitions with sources.--Sadi Carnot 04:58, 3 March 2006 (UTC)
- I don't understand why these two were selected for removal and no textbook definitions with sources were added. One way to make things clear to the general reader is to limit the way the law is expressed to a very strict definition used in modern physics textbooks. A second way is to express the law multiple times so that maybe one or two of them will "stick" in the mind of a reader with a certain background. I think the second approach is better. The word "consumed" was used and referenced to an article from 2001 about biochemical thermodynamics by someone at a department of surgery. When the second law is applied to metabolism, the word "consumed" seems to me to be current and appropriate. Flying Jazz 12:49, 3 March 2006 (UTC)
I don't understand why the entropy definition was deleted. It's not wrong or confusing. -- infinity0 16:17, 3 March 2006 (UTC)
- Regarding the first statement, in 1824 Sadi Carnot, the originator of the 2nd Law, mind you, states: “The production of motive power is then due in steam-engines not to an actual consumption of the caloric, but to its transportation from a warm body to a cold body, that is to its re-establishment of equilibrium.” Furthermore, for the main statements of the 2nd Law we should be referencing someone from a department of engineering, not someone from a department of surgery; certainly this reference is good for discussion, but not as a principal source for one of the grandest laws in the history of science. Regarding the second statement, to within an approximation, the earth delineated at the Karman line can be modeled as a closed system, i.e. heat flows across the boundary daily in a significant manner, but matter, aside from the occasional asteroid input and negligible atmospheric mass loss, does not. Hence, entropy has decreased significantly over time. This model contradicts the above 2nd Law statement. Regarding the layperson viewpoint, certainly it is advisable to re-word things to make them stick, but when it comes to laws, precise wording is a scientific imperative.--Sadi Carnot 18:37, 3 April 2006 (UTC)
New "definition"
User talk:24.93.101.70 keeps inserting this in:
- If thermodynamic work is to be done at a finite rate, free energy must be consumed. (This statement is based on the fact that free energy can be accurately defined as that portion of any First-Law energy that is available for doing thermodynamic work; i.e., work mediated by thermal energy. Since free energy is subject to irreversible loss in the course of thermodynamic work and First-Law energy is not, it is evident that free energy is an expendable, Second-Law kind of energy.)
This definition is neither succinct nor clear. Also, it's very confusing and very very long winded, and it makes no sense (to me, and therefore the reader too). "First-Law" energy? What's that?? -- infinity0 22:41, 21 March 2006 (UTC)
- Agree, replaced the word "consumed" accordingly; 1st Law = "Energy is Conserved" (not consumed).--Sadi Carnot 19:12, 3 April 2006 (UTC)
Mathematical definition
I tweaked the mathematical definition a little, which probably isn't the best way to do it. However, I was thinking that a much more accurate mathematical description would be:
"As the time over which the average is taken goes to ∞:
⟨dS/dt⟩ ≥ 0
where
- dS/dt is the instantaneous change in entropy per time"
Comments? Fresheneesz 10:43, 4 April 2006 (UTC)
- I feel this is a worthwhile point; however, it should go in a separate "header" section (such as: Hypothetical violations of 2nd Law) and there should be reference to such articles as: 2nd Law Beads of Doubt. That is, the main section should be clear and strong. Esoteric deviations should be discussed afterwards. --Sadi Carnot 15:54, 4 April 2006 (UTC)
- If you correctly state the second law such that the entropy increases on average, then there are no violations of the 2nd law. (The terminology used in your reference is dodgy.) That the entropy can sometimes go down is no longer hypothetical, since transient entropy reductions have been observed. Really there should be links to the Fluctuation theorem and Jarzynski equality, since these are, essentially, new and improved, cutting-edge extensions of the second law. Nonsuch 19:18, 4 April 2006 (UTC)
- The idea that the entropy increases on the average (not all of the time) is very old, and should be included. It is not necessary to average over infinite time. On average, the entropy increases with time, for any time interval. The reason that this point is normally glossed over is that for a macroscopic system we can generally ignore the differences between average and instantaneous values of energy and entropy. But these days we can do experimental thermodynamics on microscopic systems. Nonsuch 19:18, 4 April 2006 (UTC)
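For readers following this thread, here is a minimal toy simulation (my own construction, an Ehrenfest-style two-urn model; it is not taken from any of the papers cited above) illustrating the point: for a small number of particles the entropy climbs towards its maximum only on average, with frequent transient decreases that shrink as the particle number grows.

```python
# Ehrenfest two-urn toy model: N particles in a box, and at each step one
# randomly chosen particle hops to the other half. Entropy here is the
# Boltzmann log-multiplicity (k_B = 1) of the left/right split.
import math
import random

def entropy(N, n):
    """Entropy of a split with n of N particles in the left half."""
    return math.log(math.comb(N, n))

def simulate(N=50, steps=500, seed=0):
    random.seed(seed)
    n = N                      # start with all particles on one side (S = 0)
    history = [entropy(N, n)]
    for _ in range(steps):
        # a uniformly chosen particle hops; with probability n/N it was on the left
        if random.random() < n / N:
            n -= 1
        else:
            n += 1
        history.append(entropy(N, n))
    return history

S = simulate()
drops = sum(1 for a, b in zip(S, S[1:]) if b < a)
print(f"entropy went from {S[0]:.2f} to {S[-1]:.2f}; "
      f"{drops} of {len(S) - 1} steps decreased it")
```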
See also "Treatise with Reasoning Proof of the Second Law of Energy Degradation"
"Treatise with Reasoning Proof of the Second Law of Energy Degradation: The Carnot Cycle Proof, and Entropy Transfer and Generation," by Milivoje M. Kostic, Northern Illinois University http://www.kostic.niu.edu/Kostic-2nd-Law-Proof.pdf and http://www.kostic.niu.edu/energy
“It is crystal-clear (to me) that all confusions related to the far-reaching fundamental Laws of Thermodynamics, and especially the Second Law, are due to the lack of their genuine and subtle comprehension.” (Kostic, 2006).
There are many statements of the Second Law which in essence describe the same natural phenomenon: the spontaneous direction of all natural processes towards a stable equilibrium with randomized redistribution and equi-partition of energy within the elementary structure of all interacting systems (thus the universe). Therefore, the Second Law could be expressed in many forms reflecting the impossibility of creating or increasing non-equilibrium, and thus work potential, between the systems within an isolated enclosure or the universe:
1. No heat transfer from low to high temperature of no-work process (like isochoric thermo-mechanical process).
2. No work transfer from low to high pressure of no-heat process (adiabatic thermo-mechanical process).
3. No work-producing from a single heat reservoir, i.e., no more efficient work-producing heat engine cycle than the Carnot cycle.
4. Etc, etc … No creation or increase of non-equilibrium and thus work potential, but only decrease of work potential and non-equilibrium towards a common equilibrium (equalization of all energy-potentials) accompanied with entropy generation due to loss of work potential at system absolute temperature, resulting in maximum equilibrium entropy.
All the Second Law statements are equivalent since they reflect the equality of work potential between all system states reached by any and all reversible processes (reversibility is a measure of equivalency) and the impossibility of creating or increasing a system's non-equilibrium and work potential.
About the Carnot Cycle: For given heat reservoirs’ temperatures, no other heat engine could be more efficient than a (any) reversible cycle, since otherwise such a reversible cycle could be reversed and coupled with that higher-efficiency cycle to produce work permanently (create non-equilibrium) from a single low-temperature reservoir in equilibrium (with no net heat transfer to the high-temperature reservoir). This implies that all reversible cycles, regardless of the cycle medium, must have the same efficiency, which is also the maximum possible, and that irreversible cycles may and do have smaller efficiency, down to zero (no net work) or even negative (consuming work, thus no longer a power cycle). Carnot’s reasoning opened the way to the generalization of reversibility and energy process equivalency, the definition of absolute thermodynamic temperature and a new thermodynamic material property, “entropy,” as well as the Gibbs free energy, one of the most important thermodynamic functions for the characterization of electro-chemical systems and their equilibria, thus resulting in the formulation of the universal and far-reaching Second Law of Thermodynamics. It is reasoned and proven here that the net cycle work is due to the net thermal expansion-compression, and thus the necessity for thermal cooling and compression, since the net mechanical expansion-compression is zero for any reversible cycle exposed to a single thermal reservoir only.
In conclusion, it is only possible to produce work during energy exchange between systems in non-equilibrium. Actually, the work potential (the maximum possible work to be extracted in any reversible process from the systems' non-equilibrium to the common equilibrium) is a measure of the systems’ non-equilibrium; thus the work potential can be conserved only in processes in which the non-equilibrium is preserved (conserved, i.e. rearranged), and such ideal processes could be reversed. When the systems come to equilibrium there is no potential for any process to produce (extract) work. Therefore, it is impossible to produce work from a single thermal reservoir in equilibrium (otherwise non-equilibrium would be spontaneously created, leading to a “black-hole-like energy singularity” instead of to equilibrium with randomized equi-partition of energy). It is only possible to produce work from thermal energy in a process between two thermal reservoirs in non-equilibrium (with different temperatures). Maximum work for a given heat transfer from a high- to a low-temperature thermal reservoir will be produced during an ideal, reversible cyclic process, in order to prevent any other impact on the surroundings (like net volume expansion, etc.; the net cyclic change is zero). All real natural processes between systems in non-equilibrium have a tendency towards common equilibrium and thus a loss of the original work potential, by converting other energy forms into thermal energy accompanied by an increase of entropy (randomized equi-partition of energy per absolute temperature level). Due to the loss of work potential in a real process, the resulting reduced work cannot reverse the process back to the original non-equilibrium, as is possible with ideal reversible processes. Since non-equilibrium cannot be created or increased spontaneously (by itself and without interaction with the rest of the surroundings), all reversible processes must be the most, and equally, efficient (they will equally conserve work potential; otherwise they would create non-equilibrium by coupling with differently efficient reversible processes). The irreversible processes will lose work potential to thermal energy with an increase of entropy, and thus will be less efficient than corresponding reversible processes … this will be further elaborated and generalized in other Sections of this “Treatise with Reasoning Proof of the Second Law of Energy Degradation” -- the work is in its final stage and will be finished soon.
http://www.kostic.niu.edu and http://www.kostic.niu.edu/energy
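As an aside for readers, the combined-cycle argument sketched above (coupling a supposedly better-than-Carnot engine to a reversed Carnot cycle) is easy to check numerically. The following is a minimal sketch of my own with arbitrary example numbers; it is not taken from Kostic's treatise.

```python
# If an engine could beat the Carnot efficiency, driving an ideal reversed
# Carnot cycle with its work output would move net heat from the cold to the
# hot reservoir with zero net work, contradicting the Clausius statement.
T_hot, T_cold = 600.0, 300.0          # reservoir temperatures (K), arbitrary
eta_carnot = 1.0 - T_cold / T_hot     # Carnot bound = 0.5

eta_claimed = 0.6                     # hypothetical "better-than-Carnot" engine
Q_in = 100.0                          # heat it draws from the hot reservoir
W = eta_claimed * Q_in                # work it delivers

# Feed that work into a reversed Carnot cycle (an ideal heat pump):
Q_back_to_hot = W / eta_carnot        # heat the pump delivers to the hot reservoir

net_work = W - W                      # the pump consumes exactly the engine's output
net_heat_to_hot = Q_back_to_hot - Q_in
print(f"net work: {net_work:.1f}, net heat delivered to hot reservoir: {net_heat_to_hot:+.1f}")
# A positive result (+20.0 here) means heat flowed from cold to hot with no
# net work, which the second law forbids -- so no engine can beat the bound.
```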
- Kostic, what is the point of all this verbiage? Why is it at the top of the talk page? Why does it make no sense? Nonsuch 04:20, 12 April 2006 (UTC)
Miscellany - Cables in a box
This description of the 2nd Law was quoted in a slightly mocking letter in the UK newspaper The Guardian on Wednesday 3 May 2006. It has been in the article unchanged since 12 September 2004 and at best needs editing, if not deleting. To me it adds little in the way of clarity. Malcolma 11:50, 4 May 2006 (UTC)
Why not a simple statement?
I would propose to add the following statement of the 2nd law to the "general description" section, or even the intro: "Heat cannot of itself pass from a colder to a hotter body." (Clausius, 1850). Unlike all the others, this statement of the law is understandable by everybody: any reason why it's not in the article? Pcarbonn 20:22, 6 June 2006 (UTC)
- Excellent idea! LeBofSportif 20:44, 6 June 2006 (UTC)
Order in open versus closed systems
Propose adding the following section:
Order in open versus closed systems: Granville Sewell has developed the equations for entropy change in open versus closed systems.[1][2] He shows, e.g., that St >= - (the integral of the heat flux vector J through the boundary).
In summary:
- Order cannot increase in a closed system.
- In an open system, order cannot increase faster than it is imported through the boundary.
Sewell observes:
The thermal order in an open system can decrease in two ways -- it can be converted to disorder, or it can be exported through the boundary. It can increase in only one way: by importation through the boundary.[3]
Added a link to Sewell's appendix D posted on his site: Sewell, Granville (2005) Can "ANYTHING" Happen in an Open System?, and his key equation D.5 (i.e., St >= - the integral of the heat flux vector J through the boundary). This is a key formulation that is not shown so far in the Wiki article. I will add the explicit equations, especially D.4 and D.5. DLH 14:05, 13 July 2006 (UTC)
- Notable? FeloniousMonk 05:54, 13 July 2006 (UTC)
- Not highly, I should say, but he most certainly has a clear bias: idthefuture.
- Besides, I'm not sure what the relevance of talking about a closed system is as it's inapplicable to Earth; and the open system quotation sounds like words in search of an idea.
- In any case, this is yet another example of a creationist trying to manipulate 2LOT to support his conclusion, generally by misrepresenting both 2LOT and evolution. To wit:
- "The evolutionist, therefore, cannot avoid the question of probability by saying that anything can happen in an open system, he is finally forced to argue that it only seems extremely improbable, but really isn’t, that atoms would rearrange themselves into spaceships and computers and TV sets."page 4
- •Jim62sch• 09:35, 13 July 2006 (UTC)
- Please address Sewell's derivation of the heat flux through the boundary, equations D.1-D.5, as I believe they still apply. See: Sewell, Granville (2005) Can "ANYTHING" Happen in an Open System? DLH 14:46, 13 July 2006 (UTC)
- A mainstream way of treating DLH's point about open systems (which perhaps should be reviewed in the article) is that
- in any process the entropy of the {system + its surroundings} must increase.
- Entropy of the system can decrease, but only if that amount of entropy or more is exported into the surroundings. For example, think of a lump of ice growing in freezing-cold water. The entropy of the water molecules freezing into the lump of ice falls; but this is more than offset by the energy released by the process (the latent heat), warming up the remaining liquid water (a numeric sketch follows below). Such reactions, with a favourable ΔH, unfavourable ΔS for the system, but overall favourable ΔG, are very common in chemistry.
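For anyone who wants the numbers behind the ice example above, here is a minimal sketch with assumed, approximate values (latent heat of fusion ≈ 6.0 kJ/mol; surroundings taken 1 K below the melting point); it only illustrates the bookkeeping and is not data from any source.

```python
# Entropy bookkeeping for ice growing in slightly supercooled water: the
# freezing water loses entropy, but the latent heat released into the colder
# surroundings raises their entropy by more, so the total still increases.
H_fus = 6010.0      # J/mol, approximate latent heat of fusion of water
T_melt = 273.15     # K, temperature at which the ice forms
T_surr = 272.15     # K, assumed temperature of the surrounding water (-1 degC)

dS_system = -H_fus / T_melt         # entropy change of the freezing water
dS_surroundings = H_fus / T_surr    # entropy gained by the surroundings

print(f"dS_system       = {dS_system:+.2f} J/(mol K)")
print(f"dS_surroundings = {dS_surroundings:+.2f} J/(mol K)")
print(f"dS_total        = {dS_system + dS_surroundings:+.3f} J/(mol K)  (> 0)")
```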
- I'm afraid Sewell's article is a bit bonkers. There's nothing in the second law against people doing useful work (or creating computers etc), "paid for" by a degradation of energy. Similarly there's nothing in the second law against cells doing useful work (building structures, or establishing ion concentration gradients etc), "paid for" by a degradation of energy.
- Of course, Sewell has a creationist axe to grind. Boiled down, his argument is not really about the Second Law, by which all the above processes are possible. Instead, what I think he's really trying to drive towards is the notion that we can distinguish the idea of 'a process taking advantage of the possibilities of such energy degradation in an organised way to perpetuate its own process' as a key indicator of "life"; and then ask the question, if we understand "life" in such terms, is a transition from "no life" to "life" a discontinuous qualitative step-change, which cannot be explained by a gradualist process of evolution?
- This is an interesting question to think about; but the appropriate place to analyse it is really under abiogenesis. It doesn't have a lot to do with the second law of thermodynamics. For myself, I think the creationists are wrong. I think we can imagine simple boundaries coming into being naturally, creating a separation between 'system' and 'surroundings' - for example, perhaps in semi-permeable clays, or across a simple lipid boundary; and then I think we can imagine there could be quite simple chemical positive feedback mechanisms that could make use of e.g. external temperature gradients to reinforce that boundary; and the whole process of gradually increasing complexity could pull itself up by its own bootstraps from there.
- But that's not really an argument for here. The only relevant thing as regards the 2nd law is that the 2nd law does not forbid increasing order or structure or complexity in part of a system, so long as it is associated with an increase in overall entropy. -- Jheald 10:05, 13 July 2006 (UTC).
- Putting aside whatever ad hominem by association arguments you have against Sewell, please address the proposed statement summarizing Sewell's formulation of the entropy vs. the flux through the boundary, and the links provided. DLH 14:05, 13 July 2006 (UTC) DLH 14:23, 13 July 2006 (UTC)
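For reference, the mainstream open-system treatment mentioned a few comments above is usually written as an entropy balance (this is the standard textbook form in Prigogine's notation, not a transcription of Sewell's equation D.5):

```latex
% Entropy balance for an open system: the total change splits into an
% exchange term d_e S (entropy carried across the boundary by heat and matter,
% which may have either sign) and an internal production term d_i S; only the
% production term is constrained by the second law.
\frac{dS}{dt} \;=\; \frac{d_e S}{dt} \;+\; \frac{d_i S}{dt},
\qquad
\frac{d_i S}{dt} \;\ge\; 0 .
```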
Scientifically Based
Removed religious comments in the article; please let us keep these articles scientifically based. I have never opened a textbook and come across those lines in any section on the second law. Physical Chemist 19:26, 18 July 2006 (UTC)
References
- ^ Sewell, Granville (2005). The Numerical Solution of Ordinary and Partial Differential Equations, 2nd Edition. John Wiley & Sons. ISBN 0471735809. Appendix D.
- ^ Sewell, Granville (2005). Can "ANYTHING" Happen in an Open System?
- ^ Sewell, Granville (2005). A Second Look at the Second Law.
maximize energy degradation?
The article currently says: "[The system tends to evolve towards a structure ...] which very nearly maximise the rate of energy degradation (the rate of entropy production)". I've added a request for a source for this, and it would be interesting to expand this statement further (in this or other related articles; I could not find any description of this statement in Wikipedia). Pcarbonn 11:16, 25 July 2006 (UTC)
- Prigogine and co-workers first proposed the Principle of Maximum Entropy Production in the 1940s and 50s, for near-equilibrium steady-state dissipative systems. This may or may not extend cleanly to more general further-from-equilibrium systems. Cosma Shalizi quotes P.W. Anderson (and others) being quite negative about it [2]. On the other hand, the Lorenz and Dewar papers cited at the end of Maximum entropy thermodynamics seem much more positive, both about how the principle can be rigorously defined and derived, and about how it does appear to explain some real quantitative observations. Christian Maes has also done some good work in this area, which (ISTR) says it does pretty well work for the right definition of "entropy production", but there may be some quite subtle bodies buried.
- A proper response to you probably Needs An Expert; I'm afraid my knowledge is fairly narrow in this area, and not very deep. But hope the paragraph above gives a start, all the same. Jheald 12:23, 25 July 2006 (UTC).
- Note also, just to complicate things, that in effect there's a minimax going on, so Maximum Entropy Production with respect to one set of variables can also be Minimum Entropy Production (as discussed at the end of the Dewar 2005 paper, a bit mathematically). It all depends what variables you're imagining holding constant, and what ones can vary (and what Legendre transformations you've done). In fact it was always "Minimum" Entropy Production that Prigogine talked about; but re-casting that in MaxEnt terms is probably a more general way of thinking about what's going on. Jheald 13:09, 25 July 2006 (UTC).
Air conditioning?
I asked this question on the AC page, but didn't get an answer: how does air conditioning NOT violate the 2nd law? Isn't it taking heat from a warm environment (the inside of a building), and transferring it to a hotter environment (outside the building)? Doesn't this process in effect reverse the effects of entropy? Inforazer 13:19, 13 September 2006 (UTC)
- The AC unit also plugs into the wall. Drawing electricity and converting it to heat increases entropy. Hope this helps. 192.75.48.150 14:44, 13 September 2006 (UTC)
- That's right - The entropy of a CLOSED SYSTEM always increases. A closed system is one in which no material or energy goes in or out. You have electricity flowing into your room and besides, the air conditioner is dumping heat to the outside. The entropy loss you experience in the cool room is more than offset by the entropy transferred to the outside and the entropy gain at the electrical power plant. PAR 15:51, 13 September 2006 (UTC)
- See heat pump for a further, more detailed analysis. Jheald 17:15, 13 September 2006 (UTC).
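To put PAR's point in numbers, here is a minimal sketch with made-up figures (they do not describe any particular air conditioner):

```python
# An air conditioner pulls Q_c from the cool room, takes work W from the wall
# socket, and rejects Q_c + W outdoors. The room's entropy drops, but the
# outdoors gains more, so the total entropy change is still positive.
T_room = 295.0       # K, the cooled interior (about 22 degC)
T_outside = 308.0    # K, the hot exterior (about 35 degC)
Q_c = 1000.0         # J, heat removed from the room (arbitrary)
W = 300.0            # J, electrical work supplied (arbitrary, above the minimum)
Q_h = Q_c + W        # J, heat rejected outdoors (first law)

dS_room = -Q_c / T_room
dS_outside = Q_h / T_outside
print(f"dS_room    = {dS_room:+.3f} J/K")
print(f"dS_outside = {dS_outside:+.3f} J/K")
print(f"dS_total   = {dS_room + dS_outside:+.3f} J/K  (>= 0, no violation)")
```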
Fridenfalk's Paradox
I have lately found a mathematical proof that contradicts the second law of thermodynamics. This proof is new and was recently published by a university as an official document. The paper has been reviewed by several experts in the field and no errors were found. Would it be permissible for me to write a note about it under Criticisms with a link to a brief summary? Since this mathematical proof contradicts the second law of thermodynamics and since the proof is extremely simple, it has been called a "paradox" by one of the reviewers. --Hyperkraft 19:48, 15 September 2006 (UTC)
- Why not put it here on the talk page, and then if it seems worthwhile to people, put it in the article. Normally, this would not be the routine, but disproving the second law is like saying "can I put a description of a perpetual motion machine in the conservation of energy article?". Let's look at it first. PAR 22:18, 15 September 2006 (UTC)
- Here it comes: Fridenfalk's Paradox and The Principal of Inhomogeneous Equilibrium. The full publications may be found at http://www.hgo.se/game/wiki/index.php?title=User:Mikael#Internal_Reports. The publications are official documents and may be ordered directly from Gotland University in Sweden as a hard copy. The theory is almost two years old, but was made official a few days ago, which is why there are still no Google hits.--Hyperkraft 16:54, 16 September 2006 (UTC)
- Do not put this on the page. It is a perpetual motion machine, and needs to be verified by much more than just computer simulations. The references you quote are not from refereed journals. The machine involves a pinball inside a specially shaped box, and relies on inelastic collisions of the pinball to generate energy. I don't want to burst your bubble, but it's almost certain that some aspect of the analysis is wrong. I will try to figure it out, but I am wondering if you know enough about the machine to explain it? PAR 18:18, 16 September 2006 (UTC)
- Sure, it can definitely wait. I think that the important thing is the mathematical proof (The Principal of Inhomogeneous Equilibrium). The simulation is just a pedagogic tool. In short, it seems that "local thermodynamic equilibrium", which implies a mean energy exchange of zero between particles and surfaces, creates a collision mapping that is highly non-uniform in nature. This, together with the Principal of Inhomogeneous Equilibrium, which is a mathematical proof, seems to contradict the second law of thermodynamics. A professor in thermodynamics I recently spoke with implied that it is highly possible that the Principal of Inhomogeneous Equilibrium could actually be valid, since it is a mathematical proof. But in that case the average collision mapping of particles with surfaces must be perfectly uniform for the second law of thermodynamics to hold. According to our present knowledge, the particle-surface collision mapping is highly non-uniform (thermal energy exchange at particle-surface collisions is a quantum mechanical effect), but if it were hypothetically perfectly uniform (which would be a scientific sensation in itself), then it would be balancing on a mathematical singularity point, which is a very unstable situation.--Hyperkraft 22:02, 16 September 2006 (UTC)
Your reversal of entropy paper was funny; I assume this is your summary:
- “The second law of thermodynamics is invalid because spontaneous pressure gradients, in nature, can be generated and used for energy production.”
I’d like to see you define "disentropic energy". Just poking some fun at it. Later: --Sadi Carnot 15:06, 17 September 2006 (UTC)
- Yes, it is kind of funny, but since the theory of disentropic energy is a direct consequence (which can be shown mathematically) of the mathematical proof The Principal of Inhomogeneous Equilibrium, I think it is a paradox that is worthy of investigation. This theory is probably the simplest theory yet in this field, yet no one seems to be able to find the flaw. Sadi Carnot, perhaps you can be of assistance here to review it? That would be most appreciated. One of the largest companies in the world assembled, a few weeks ago, a scientific committee to explore this paradox. This committee has not yet found any flaws in the theory. A large number of scientists have additionally tried this theory without finding any flaws. The strength of the paradox seems to be its extreme simplicity. --Hyperkraft 19:07, 17 September 2006 (UTC)
PAR, I can try to explain the machine. It seems that non-uniform collision mapping (which occurs at thermal energy exchange) tends to make some directions more probable than others. That means that a net motion is created for a ball in a cabinet according to Fig. 1 in the document “Fridenfalk’s Paradox”. If the ball were an atom, hitting the turbine blades would on average have decreased the kinetic energy of the atom, producing work. Thus the temperature of the gas would have decreased by a small amount, but this temperature would then have been restored by atom-surface collisions (equalizing the temperature). This is the whole idea of transferring thermal energy to kinetic energy with 100% efficiency, which is basically the essence of this paradox.--Hyperkraft 18:59, 17 September 2006 (UTC)
- Please don't use Wikipedia talk pages as a physics discussion forum. They should be used to co-ordinate the enhancement of the article, which by WP:NOR, WP:V and WP:RS can only be done using reliable, published sources, not your own research. --Pjacobi 19:15, 17 September 2006 (UTC)
- The two attached documents above are not the published papers, but only a clarification and summary of published papers. Fridenfalk's Paradox describes results within officially published materials by an accredited university (published a few days ago, which is why it is still not on Google) and has been reviewed by many more expert reviewers than it usually takes to clear an ordinary paper in this field. The only problem here is that the results are controversial, but that is why it is called a "paradox". A good paradox should always be controversial. So, should we wait half a year until this paradox is published by other sources as well and then put it on the page, or should we do it now? This is the question. Please let me know what you think. I think Fridenfalk's Paradox satisfies all Wikipedia criteria for publication. There is no scientist who disagrees that a mathematical proof that is in conflict with the second law of thermodynamics is an obvious paradox (consensus rule), this paradox has been investigated for nearly two years (thus it is not new), has been extensively reviewed by many experts and finally published as an official document (thus it is possible to refer to it) by an institution that is at least as reliable as most scientific journals. So I think we should go ahead and publish it now, but that is my vote. --Hyperkraft 19:35, 17 September 2006 (UTC)
Provide a citation right now or stop. --Pjacobi 20:04, 17 September 2006 (UTC)
- Citation: http://www.hgo.se/game/wiki/index.php?title=User:Mikael#Internal_Reports. Look under: Internal Reports. The term Internal Report may be confusing since this is an officially published document by now. In addition, one of these documents is already available as an official governmental publication by the university and may be requested from www.hgo.se as a PDF or hard copy (without charge as far as I know). Another is scheduled for release as an official publication by Monday and may be ordered the same way. This should satisfy the Wikipedia criteria, but if it does not, no hard feelings. --Hyperkraft 20:39, 17 September 2006 (UTC)
- O.K., thanks for the reference. [3] is not a reliable source. And even if it were, this stuff can be ignored by precedent and policy. Once it gets published and someone in the field starts bothering, citing and commenting on it, you can bring up the issue again. --Pjacobi 20:44, 17 September 2006 (UTC)
- No problem at all. Sorry to break any Wikipedia rule, I am a new user. By the way, can you please specify what is meant by "this stuff can be ignored by precedent and policy"? Thanks. --Hyperkraft 20:50, 17 September 2006 (UTC)
A Tendency, Not a Law
I cut the following false part, it being a mathematical fact that when two systems of unequal temperature are adjoined, heat will flow between them, and at thermodynamic equilibrium ΔS will be positive (see: entropy). Hence, the 2nd law is a mathematical consequence of the definition S = Q/T. --Sadi Carnot 14:55, 17 September 2006 (UTC):
- The second law states that there is a statistical tendency for entropy to increase. Expressions of the second law always include terms such as "on average" or "tends to". The second law is not an absolute rule. The entropy of a closed system has a certain amount of statistical variation, it is never a smoothly increasing quantity. Some of these statistical variations will actually cause a drop in entropy. The fact that the law is expressed probabilistically causes some people to question whether the law can be used as the basis for proofs or other conclusions. This concern is well placed for systems with a small number of particles but for everyday situations, it is not worth worrying about.
- I restored the above paragraph, not because I think it deserves to be in the article, but because it certainly should not be deleted for being false. The above statements are not false. The second law is not absolutely, strictly true at all points in time. The explanation of the second law is given by statistical mechanics, and that means there will be statistical variations in all thermodynamic parameters. The pressure in a vessel at equilibrium is not constant. It is the result of particles randomly striking the walls of the vessel, and sometimes a few more will strike than the average, sometimes a few less. The same is true of every thermodynamic quantity. The variations are roughly of the order of 1/√N where N is the number of particles. For macroscopic situations, this is something like one part in ten billion - not worth worrying about, but for systems composed of, say, 100 particles it can be of the order of ten percent. PAR 15:15, 17 September 2006 (UTC)
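A quick check of the 1/√N figures quoted above (my own arithmetic, nothing more):

```python
# Relative size of thermodynamic fluctuations scales roughly as 1/sqrt(N):
# about ten percent for 100 particles, one part in ten billion for 10^20.
import math

for N in (1e2, 1e6, 1e20):
    print(f"N = {N:.0e}:  1/sqrt(N) = {1 / math.sqrt(N):.0e}")
```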
Par, I assume you are referring to the following article: Second law of thermodynamics “broken”. In any case, the cut paragraph is not sourced and doesn’t sound anything like this article; the cut paragraph seems to contradict the entire Wiki article, essentially stating that the second law is not a law. If you feel it needs to stay in the article, I would think that it needs to be sourced at least twice and cleaned.
Furthermore, both the cut section and the article I linked to and the points you are noting are very arguable. The basis of the second law derives from the following set-up, as detailed by Carnot (1824), Clapeyron (1832), and Clausius (1854):
With this standard heat-engine set-up, we can calculate the entropy change ΔS for the passage of the quantity of heat Q from the temperature T1, through the "working body" of fluid (see heat engine), which was typically a body of steam, to the temperature T2. Moreover, let us assume, for the sake of argument, that the working body contains only two molecules of water.
Next, if we make the assignment:
S = entropy = Q/T
Then, the entropy change or "equivalence-value" for this transformation is:
ΔS = ΔS(cold body) + ΔS(hot body)
which equals:
ΔS = Q/T2 − Q/T1
and by factoring out Q, we have the following form, as was derived by Clausius:
ΔS = Q(1/T2 − 1/T1)
Thus, for example, if Q was 50 units, T1 was 100 degrees, and T2 was 1 degree, then the entropy change for this process would be 49.5. Hence, entropy increased for this process; it did not “tend” to increase. For this system configuration, subsequently, it is an "absolute rule". This rule is based on the fact that all natural processes are irreversible by virtue of the fact that molecules of a system, for example two molecules in a tank, will not only do external work (such as to push a piston), but will also do internal work on each other, in proportion to the heat used to do work (see: Mechanical equivalent of heat) during the process. Entropy accounts for the fact that internal inter-molecular friction exists.
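For anyone checking the arithmetic, the quoted figure follows directly from the factored form above:

```python
# Q = 50, T1 = 100, T2 = 1 (in the units used in the comment)
Q, T1, T2 = 50.0, 100.0, 1.0
dS = Q * (1.0 / T2 - 1.0 / T1)
print(dS)   # 49.5, matching the value in the discussion
```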
Now, this law has been solid for 152 years and counting. Thus, if, in the article, you want to say “so and so scientist argues that the second law is invalid…” then do so and source it and all will be fine. But to put some random, supposedly invalid, unsourced argument in the article is not a good idea. It will give the novice reader the wrong opinion or possibly the wrong information. --Sadi Carnot 16:55, 17 September 2006 (UTC)
- I think that the second law of thermodynamics is definitely supposed to be a law, considering that it is the convergence to a final value that is of interest here, not intrinsic statistical variations. There are three possible cases. Convergence over time (for an isolated system etc.) to a state of (1) lower entropy, (2) constant entropy, (3) higher entropy. The second law of thermodynamics states that only (2) and (3) are possible. If it would have stated that all (1), (2) and (3) are possible, just that (2) and (3) are more probable than (1), it would have probably not been taken seriously by contemporary scientists. --Hyperkraft 18:43, 17 September 2006 (UTC)