Entropy Means...Nothing!

Dec 27, 2022
Any thermodynamicist would (reluctantly) agree that, if the entropy is not a state function, then it is nonsense. But then the thermodynamicist would say that the entropy IS a state function and that this has been proven ever since Clausius. In the end the thermodynamicist might even quote Einstein and Eddington, who highly valued the ideology of thermodynamics as a prototype for the ideology of relativity.

Is there any proof that the entropy is a state function? There is none. Here is an oversimplified history of the entropy concept:

If you define the entropy S as a quantity that obeys the equation dS = dQ_rev/T, you will find that, so defined, the entropy is a state function FOR AN IDEAL GAS: https://socratic.org/questions/is-entropy-state-function-how-prove-it Clausius set out to prove that the entropy (so defined) is a state function for ANY system. He based his proof on the assumption that any cycle can be decomposed into small Carnot cycles, and to this day that proof remains the only justification of "Entropy is a state function":

"Carnot Cycles: S is a State Function. Any reversible cycle can be thought of as a collection of Carnot cycles - this approximation becomes exact as cycles become infinitesimal. Entropy change around an individual cycle is zero. Sum of entropy changes over all cycles is zero." http://mutuslab.cs.uwindsor.ca/schurko/introphyschem/lectures/240_l10.pdf

The assumption on which "Entropy is a state function" rests - that any cycle can be subdivided into small Carnot cycles - is obviously false. An isothermal cycle CANNOT be subdivided into small Carnot cycles, a cycle involving the action of conservative forces CANNOT be subdivided into small Carnot cycles, and so on.

Conclusion: The belief that the entropy is a state function is totally unjustified. Any time scientists use the term "entropy", they don't know what they are talking about:

"My greatest concern was what to call it. I thought of calling it 'information', but the word was overly used, so I decided to call it 'uncertainty'. When I discussed it with John von Neumann, he had a better idea. Von Neumann told me, 'You should call it entropy, for two reasons: In the first place your uncertainty function has been used in statistical mechanics under that name, so it already has a name. In the second place, and more important, nobody knows what entropy really is, so in a debate you will always have the advantage." https://en.wikipedia.org/wiki/History_of_entropy

Here Sabine Hossenfelder extensively uses the term "entropy" (nobody knows what entropy really is, so she sounds very convincing):

https://www.youtube.com/watch?v=2QnRpinVmo4&t=8s
 
Dec 27, 2022
Arthur Eddington: "The law that entropy always increases—the Second Law of Thermodynamics—holds, I think, the supreme position among the laws of Nature. If someone points out to you that your pet theory of the universe is in disagreement with Maxwell’s equations—then so much the worse for Maxwell’s equations. If it is found to be contradicted by observation—well these experimentalists do bungle things sometimes. But if your theory is found to be against the second law of thermodynamics I can give you no hope; there is nothing for it but to collapse in deepest humiliation." https://todayinsci.com/E/Eddington_Arthur/EddingtonArthur-Entropy-Quotations.htm

The version of the second law of thermodynamics known as "Entropy always increases" is in fact a theorem deduced by Clausius in 1865:

Jos Uffink, professor at the University of Minnesota, p. 37: "Hence we obtain: THE ENTROPY PRINCIPLE (Clausius' version) For every nicht umkehrbar [irreversible] process in an adiabatically isolated system which begins and ends in an equilibrium state, the entropy of the final state is greater than or equal to that of the initial state. For every umkehrbar [reversible] process in an adiabatical system, the entropy of the final state is equal to that of the initial state." http://philsci-archive.pitt.edu/archive/00000313/

Clausius' deduction was predicated on three postulates:

Postulate 1 (implicit): The entropy is a state function.

Postulate 2: Clausius' inequality (formula 10 on p. 33 in Uffink's article; in modern notation, ∮ dQ/T ≤ 0 for any cycle) is correct.

Postulate 3: Any irreversible process can be closed by a reversible process to become a cycle.
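
To see where each postulate enters, here is the standard deduction in outline (my paraphrase of the textbook argument, not a quote from Uffink): close an irreversible process from state 1 to state 2 with a reversible return from 2 to 1 (Postulate 3), and apply Clausius' inequality (Postulate 2) to the resulting cycle:

$$\oint \frac{dQ}{T}=\int_1^2\frac{dQ_{\rm irr}}{T}+\int_2^1\frac{dQ_{\rm rev}}{T}\le 0$$

Because the entropy is assumed to be a state function (Postulate 1), the reversible leg satisfies ∫ dQ_rev/T = S(1) - S(2), so

$$\int_1^2\frac{dQ_{\rm irr}}{T}\le S(2)-S(1)$$

For an adiabatically isolated system, dQ = 0 along the irreversible leg, hence S(2) ≥ S(1). Remove any one of the three postulates and the conclusion no longer follows.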

The three postulates remain totally unjustified even nowadays. Postulate 1 can easily be disproved by considering cycles (heat engines) converting heat into work in ISOTHERMAL conditions: over such a cycle ΔU = 0, so the net heat absorbed equals the work done, and ∮ dQ/T = W/T > 0, whereas a state function would require ∮ dS = 0. Postulate 3 is repudiated even in official thermodynamics:

Uffink, p.39: "A more important objection, it seems to me, is that Clausius bases his conclusion that the entropy increases in a nicht umkehrbar [irreversible] process on the assumption that such a process can be closed by an umkehrbar [reversible] process to become a cycle. This is essential for the definition of the entropy difference between the initial and final states. But the assumption is far from obvious for a system more complex than an ideal gas, or for states far from equilibrium, or for processes other than the simple exchange of heat and work. Thus, the generalisation to all transformations occurring in Nature is somewhat rash."

Note that the theorem only holds for "an adiabatically isolated system which begins and ends in an equilibrium state". This means that, even if Clausius' postulates were true (they are not), all applications of "Entropy always increases" to processes which do not begin and end in equilibrium would still be unjustified!
 
Dec 27, 2022
Athel Cornish-Bowden https://en.wikipedia.org/wiki/Athel_Cornish-Bowden: "The concept of entropy was introduced to thermodynamics by Clausius, who deliberately chose an obscure term for it, wanting a word based on Greek roots that would sound similar to "energy". In this way he hoped to have a word that would mean the same to everyone regardless of their language, and, as Cooper [2] remarked, he succeeded in this way in finding a word that meant the same to everyone: NOTHING. From the beginning it proved a very difficult concept for other thermodynamicists, even including such accomplished mathematicians as Kelvin and Maxwell; Kelvin, indeed, despite his own major contributions to the subject, never appreciated the idea of entropy [3]. The difficulties that Clausius created have continued to the present day, with the result that a fundamental idea that is absolutely necessary for understanding the theory of chemical equilibria continues to give trouble, not only to students but also to scientists who need the concept for their work." https://www.beilstein-institut.de/download/712/cornishbowden_1.pdf

Professor Jos Uffink, University of Minnesota: "I therefore argue for the view that the second law has nothing to do with the arrow of time...One of the most frequently discussed aspects of the Second Law is its relation with the ‘arrow of time’. In fact, in many texts in philosophy of physics the Second Law figures as an emblem of this arrow. The idea is, roughly, that typical thermodynamical processes are irreversible, i.e. they can only occur in one sense only, and that this is relevant for the distinction between past and future. At first sight, the Second Law is indeed relevant for this arrow. If the entropy can only increase during a thermodynamical process, then obviously, a reversal of this process is not possible. Many authors believe this is a crucial feature, if not the very essence of the Second Law. Planck, for example, claimed that, were it not for the existence of irreversible processes, ‘the entire edifice of the second law would crumble...and theoretical work would have to start from the beginning.’ (Planck 1897, §113), and viewed entropy increase as a ‘universal measure of irreversibility’...This summary leads to the question whether it is fruitful to see irreversibility or time-asymmetry as the essence of the second law. Is it not more straightforward, in view of the unargued statements of Kelvin, the bold claims of Clausius and the strained attempts of Planck, to give up this idea? I believe that Ehrenfest-Afanassjewa was right in her verdict that the discussion about the arrow of time as expressed in the second law of the thermodynamics is actually a RED HERRING." http://philsci-archive.pitt.edu/313/1/engtot.pdf
 
Dec 27, 2022
The concept of entropy is not even wrong. The concept of the black hole is not even wrong either. Then their marriage must be not even not even wrong:

"The Bekenstein-Hawking entropy or black hole entropy is the amount of entropy that must be assigned to a black hole in order for it to comply with the laws of thermodynamics as they are interpreted by observers external to that black hole...In ordinary thermodynamics the second law requires that the entropy of a closed system shall never decrease, and shall typically increase as a consequence of generic transformations. While this law may hold good for a system including a black hole, it is not informative in its original form. For example, if an ordinary system falls into a black hole, the ordinary entropy becomes invisible to an exterior observer, so from her viewpoint, saying that ordinary entropy increases does not provide any insight: the ordinary second law is transcended. Including the black hole entropy in the entropy ledger gives a more useful law, the generalized second law of thermodynamics (GSL) (Bekenstein 1972, 1973, 1974): the sum of ordinary entropy So outside black holes and the total black hole entropy never decreases and typically increases as a consequence of generic transformations of the black hole. In equations ΔSo + ΔSBH ≥ 0." http://www.scholarpedia.org/article/Bekenstein-Hawking_entropy
 
Dec 12, 2022
Well OK,

But one should admit that the concept of entropy works well for engineering designs. I have designed some engines and wouldn't throw out the use of entropy. A T-S chart is a very good tool to have around.
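
As an illustration of why the T-S chart is handy, here is a minimal sketch (all numbers assumed for illustration) that plots an ideal Carnot cycle in the T-S plane, where it is just a rectangle whose enclosed area equals the net work per cycle:

import matplotlib.pyplot as plt

T_h, T_c = 600.0, 300.0   # isotherm temperatures, K (assumed)
S1, S2 = 1.0, 2.0         # entropy at the two isentropes, kJ/K (assumed)

# Carnot cycle in the T-S plane: two isotherms + two isentropes = a rectangle
S = [S1, S2, S2, S1, S1]
T = [T_h, T_h, T_c, T_c, T_h]

plt.plot(S, T)
plt.xlabel("S (kJ/K)")
plt.ylabel("T (K)")
plt.title("Carnot cycle on a T-S chart")

# net work per cycle = enclosed area = (T_h - T_c) * (S2 - S1)
print("net work per cycle:", (T_h - T_c) * (S2 - S1), "kJ")
plt.show()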

That said, I found what I believe is a limit to the classical concept of entropy, at least in dealing with gases and vapors. Here entropy is a macro concept, but it seems to break down in some circumstances on a micro level.

Follow along with the pictures in your mind, if you will.

You know what Brownian motion is: small particles and molecules that are warm move around and knock into each other, causing movement. The same happens in gases, where the molecules bump into each other. The force with which these molecules collide is related to both temperature and pressure.
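
To put rough numbers on that last sentence, here is a minimal sketch (assuming nitrogen molecules; the temperatures are arbitrary) of how the mean kinetic energy and typical molecular speed scale with temperature:

import math

K_B = 1.380649e-23   # Boltzmann constant, J/K
M_N2 = 4.652e-26     # mass of one N2 molecule, kg (28 g/mol / Avogadro)

def thermal_stats(T):
    """Mean translational kinetic energy and rms speed at temperature T (K)."""
    e_mean = 1.5 * K_B * T                  # <E_kin> = (3/2) k T
    v_rms = math.sqrt(3 * K_B * T / M_N2)   # from (1/2) m v_rms^2 = (3/2) k T
    return e_mean, v_rms

for T in (100, 300, 600):
    e, v = thermal_stats(T)
    print(f"T = {T:3d} K: <E_kin> = {e:.2e} J, v_rms = {v:6.1f} m/s")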

Classically, an engine like a Carnot engine needs both a hot source and a cold sink to operate and produce any useful work.

Both the hot source and the cold sink, along with the engine, can be isolated, with only a turning shaft sticking out to measure the "useful" work. When all is done and the engine grinds to a halt, a measured amount of work will have been done and the hot source and the cold sink will have come to the same temperature. As far as efficiency goes, the Carnot engine is as good as you can do.
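
That end state can even be put in numbers. For a reversible engine run between two finite reservoirs of equal heat capacity, requiring zero total entropy change fixes the common final temperature at T_f = sqrt(T_h * T_c), and energy conservation gives the shaft work. A minimal sketch (heat capacity and temperatures assumed):

import math

C = 1000.0                 # heat capacity of each reservoir, J/K (assumed)
T_h, T_c = 600.0, 300.0    # initial reservoir temperatures, K (assumed)

# Reversible operation: total entropy change is zero:
#   C*ln(T_f/T_h) + C*ln(T_f/T_c) = 0  =>  T_f = sqrt(T_h * T_c)
T_f = math.sqrt(T_h * T_c)

Q_hot = C * (T_h - T_f)    # heat drawn from the hot reservoir
Q_cold = C * (T_f - T_c)   # heat dumped into the cold reservoir
W = Q_hot - Q_cold         # net shaft work, by the first law

print(f"final temperature: {T_f:.1f} K")
print(f"work extracted:    {W:.1f} J")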

Now, though, picture a recently made nano device. It is, of course, a very small little thing: a ratchet with a shaft sticking out of it, and it only turns one way. It is small enough, and open on one side, that a molecule can collide with the ratchet gear and turn it, but only one way.

That shaft is what sticks out of our isolated box, and the work it produces can be measured. The little system extracts work from that box of warm gas, and as it does, the temperature inside drops. Hot source only, no cold sink. It will continue doing so until the temperature is so low that the colliding gas molecules have insufficient force to turn the ratchet.

Conservation of energy isn't broken. Nope, but the "engine" produces "useful" work, and the entropy of the little isolated box decreases. Hot and cold remain unmixed, and work was done.
 