Any thermodynamicist would (reluctantly) agree that, if the entropy is not a state function, then the concept is nonsense. But then the thermodynamicist would insist that the entropy IS a state function and that this was proved by Clausius long ago. In the end the thermodynamicist might even quote Einstein and Eddington, who highly valued the ideology of thermodynamics as a prototype for the ideology of relativity.
Is there any proof that the entropy is a state function? There is none. Here is an oversimplified history of the entropy concept:
If you define the entropy S as a quantity that obeys the equation dS = dQ_rev/T, you will find that, so defined, the entropy is a state function FOR AN IDEAL GAS (a numerical check of this claim is sketched below): https://socratic.org/questions/is-entropy-state-function-how-prove-it
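To make the ideal-gas case concrete, here is a minimal numerical sketch (my own illustration, not taken from the linked page; the monatomic heat capacity and the rectangular path in the (T, V) plane are assumed values). For an ideal gas dQ_rev = nCv dT + (nRT/V) dV, so dQ_rev/T = nCv dT/T + nR dV/V, and summing this increment around any closed path should give zero if S is a state function:

```python
# Minimal numerical sketch (not from the linked page): check that the
# integral of dQ_rev/T around a closed path vanishes for an ideal gas.
# For an ideal gas, dQ_rev = n*Cv*dT + (n*R*T/V)*dV, hence
# dQ_rev/T = n*Cv*dT/T + n*R*dV/V.
import numpy as np

n, R = 1.0, 8.314          # 1 mol; gas constant in J/(mol K)
Cv = 1.5 * R               # monatomic ideal gas (assumed for illustration)

def entropy_increment(T, V, dT, dV):
    """dQ_rev/T for an ideal gas at state (T, V)."""
    return n * Cv * dT / T + n * R * dV / V

def cycle_integral(vertices, steps=20000):
    """Sum dQ_rev/T along the closed polygonal path through (T, V) vertices."""
    total = 0.0
    closed = vertices + [vertices[0]]
    for (T0, V0), (T1, V1) in zip(closed, closed[1:]):
        Ts = np.linspace(T0, T1, steps)
        Vs = np.linspace(V0, V1, steps)
        for i in range(steps - 1):
            # midpoint rule for the increment along each small sub-step
            total += entropy_increment(0.5 * (Ts[i] + Ts[i + 1]),
                                       0.5 * (Vs[i] + Vs[i + 1]),
                                       Ts[i + 1] - Ts[i],
                                       Vs[i + 1] - Vs[i])
    return total

# Rectangle in the (T, V) plane: two isotherms joined by two isochores
# (temperatures in K, volumes in m^3 - arbitrary illustrative values).
print(cycle_integral([(300.0, 0.010), (300.0, 0.030),
                      (500.0, 0.030), (500.0, 0.010)]))
# ~0.0 up to discretization error: consistent with S being a state
# function for this one system
```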
"Carnot Cycles: S is a State Function. Any reversible cycle can be thought of as a collection of Carnot cycles - this approximation becomes exact as cycles become infinitesimal. Entropy change around an individual cycle is zero. Sum of entropy changes over all cycles is zero." http://mutuslab.cs.uwindsor.ca/schurko/introphyschem/lectures/240_l10.pdf
The assumption on which "Entropy is a state function" is based - that any cycle can be subdivided into small Carnot cycles - is obviously false. A Carnot cycle runs, by construction, between two distinct temperatures, so an isothermal cycle CANNOT be subdivided into small Carnot cycles; likewise, a cycle involving the action of conservative forces CANNOT be subdivided into small Carnot cycles; and so on.
Conclusion: The belief that the entropy is a state function is totally unjustified. Any time scientists use the term "entropy", they don't know what they are talking about:
"My greatest concern was what to call it. I thought of calling it 'information', but the word was overly used, so I decided to call it 'uncertainty'. When I discussed it with John von Neumann, he had a better idea. Von Neumann told me, 'You should call it entropy, for two reasons: In the first place your uncertainty function has been used in statistical mechanics under that name, so it already has a name. In the second place, and more important, nobody knows what entropy really is, so in a debate you will always have the advantage." https://en.wikipedia.org/wiki/History_of_entropy
Here Sabine Hossenfelder extensively uses the term "entropy" (nobody knows what entropy really is, so she sounds very convincing):
https://www.youtube.com/watch?v=2QnRpinVmo4&t=8s