The Entropy Is NOT a State Function

Dec 27, 2022
"Entropy is a state function. To prove this assertion, we need to show that the integral of dS is independent of path. To do so, it is sufficient to prove that the integral of eqn 3.1 around an arbitrary cycle is zero, for that guarantees that the entropy is the same at the initial and final states of the system regardless of the path taken between them (Fig. 3.5). That is, we need to show that
∮ dQrev/T = 0    (3.6)

where the symbol ∮ denotes integration around a closed path. There are three steps in the argument:
1. First, to show that eqn 3.6 is true for a special cycle (a 'Carnot cycle') involving a perfect gas.
2. Then to show that the result is true whatever the working substance.
3. Finally, to show that the result is true for any cycle.
[...]
For the final step in the argument, we note that any reversible cycle can be approximated as a collection of Carnot cycles and the cyclic integral around an arbitrary path is the sum of the integrals around each of the Carnot cycles." https://www.studocu.com/in/document...nergetics/entropy-the-state-function/44239210
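For reference, step 1 of the quoted argument is the standard perfect-gas Carnot calculation. The following is a sketch in the usual textbook notation (qh and qc are the heats exchanged along the hot and cold isotherms, and A, B, C, D label the corners of the cycle):

```latex
% Perfect-gas Carnot cycle A->B->C->D->A: isothermal expansion at T_h,
% adiabatic expansion, isothermal compression at T_c, adiabatic compression.
% Along an isotherm dU = 0, so the heat absorbed equals the work done:
\[
  q_h = nRT_h \ln\frac{V_B}{V_A}, \qquad
  q_c = nRT_c \ln\frac{V_D}{V_C}.
\]
% The two adiabats satisfy T V^{\gamma-1} = \mathrm{const}, which forces
% V_B/V_A = V_C/V_D, so the two logarithms are equal and opposite:
\[
  \oint \frac{dq_\mathrm{rev}}{T}
    = \frac{q_h}{T_h} + \frac{q_c}{T_c}
    = nR\ln\frac{V_B}{V_A} - nR\ln\frac{V_C}{V_D} = 0.
\]
```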

The statement

"any reversible cycle can be approximated as a collection of Carnot cycles"

is a century-and-a-half-old blatant lie.
 
Theoretical physicists would (reluctantly) agree that, if the entropy is not a state function, then the concept is nonsense. Is there any valid proof that the entropy is a state function? There is none. Here is an oversimplified history of the entropy concept:

If you define the entropy S as a quantity that obeys the equation dS=dQrev/T, you will find that, so defined, the entropy is a state function FOR AN IDEAL GAS: https://socratic.org/questions/is-entropy-state-function-how-prove-it. Clausius decided to prove that the entropy (so defined) is a state function for ANY system. He based his argument on the false assumption that any cycle can be decomposed into small Carnot cycles, and nowadays this remains the only "proof" that entropy is a state function; it is quoted below, after a quick numerical check of the ideal-gas case.
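The ideal-gas half of this is indeed easy to verify numerically. The sketch below is my own (it is not taken from either linked page), and all numerical values in it are arbitrary illustrative choices. It integrates dS=dQrev/T along two different reversible paths between the same pair of states and gets the same answer:

```python
import numpy as np

# One mole of a monatomic ideal gas; all values are arbitrary examples.
n, R = 1.0, 8.314            # mol, J/(mol*K)
Cv = 1.5 * R                 # molar heat capacity at constant volume
T1, V1 = 300.0, 1.0e-3       # initial state: K, m^3
T2, V2 = 500.0, 3.0e-3       # final state

def dS_isochoric(Ta, Tb):
    """Constant V: dQrev = n*Cv*dT, so dS = n*Cv*dT/T."""
    T = np.linspace(Ta, Tb, 200_001)
    return np.trapz(n * Cv / T, T)

def dS_isothermal(Va, Vb):
    """Constant T: dQrev = (n*R*T/V)*dV, so dS = n*R*dV/V (T drops out)."""
    V = np.linspace(Va, Vb, 200_001)
    return np.trapz(n * R / V, V)

path_a = dS_isochoric(T1, T2) + dS_isothermal(V1, V2)  # heat first, then expand
path_b = dS_isothermal(V1, V2) + dS_isochoric(T1, T2)  # expand first, then heat
exact  = n * Cv * np.log(T2 / T1) + n * R * np.log(V2 / V1)

print(path_a, path_b, exact)  # all three agree (about 15.5 J/K)
```

With the ideal-gas case in hand, here is the standard "proof" for a general working substance and a general cycle: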

"Carnot Cycles: S is a State Function. Any reversible cycle can be thought of as a collection of Carnot cycles - this approximation becomes exact as cycles become infinitesimal. Entropy change around an individual cycle is zero. Sum of entropy changes over all cycles is zero." http://mutuslab.cs.uwindsor.ca/schurko/introphyschem/lectures/240_l10.pdf

The statement "Any reversible cycle can be thought of as a collection of Carnot cycles" is a blatant lie. An isothermal cycle CANNOT be thought of as a collection of Carnot cycles, a cycle involving action of conservative forces CANNOT be thought of as a collection of Carnot cycles, etc. etc.

Conclusion: The belief that the entropy is a state function is totally unjustified. Any time scientists use the term "entropy", they don't know what they are talking about:

"The concept of entropy was introduced to thermodynamics by Clausius, who deliberately chose an obscure term for it, wanting a word based on Greek roots that would sound similar to "energy". In this way he hoped to have a word that would mean the same to everyone regardless of their language, and, as Cooper [2] remarked, he succeeded in this way in finding a word that meant the same to everyone: NOTHING. From the beginning it proved a very difficult concept for other thermodynamicists, even including such accomplished mathematicians as Kelvin and Maxwell; Kelvin, indeed, despite his own major contributions to the subject, never appreciated the idea of entropy [3]. The difficulties that Clausius created have continued to the present day, with the result that a fundamental idea that is absolutely necessary for understanding the theory of chemical equilibria continues to give trouble, not only to students but also to scientists who need the concept for their work." https://www.beilstein-institut.de/download/712/cornishbowden_1.pdf

"My greatest concern was what to call it. I thought of calling it 'information', but the word was overly used, so I decided to call it 'uncertainty'. When I discussed it with John von Neumann, he had a better idea. Von Neumann told me, 'You should call it entropy, for two reasons: In the first place your uncertainty function has been used in statistical mechanics under that name, so it already has a name. In the second place, and more important, nobody knows what entropy really is, so in a debate you will always have the advantage." https://en.wikipedia.org/wiki/History_of_entropy
 
