Conflicting Perspectives on Entropy and the Second Law of Thermodynamics

eepperly16 09/22/2017. 2 answers, 195 views
thermodynamics statistical-mechanics entropy

In classical thermodynamics, entropy is postulated to exist and to be a monotone (in $E$), concave, extensive state function $S(E,V,N)$.

In statistical mechanics, the picture becomes blurrier. Let's restrict our discussion to the classical statistical mechanics of isolated systems for the moment. In the words of Leonard Susskind, in statistical mechanics entropy becomes a function of the system and your knowledge of it. For an isolated system, $S = \log \mathcal{W}$, where $\mathcal{W}$ is the number of microstates the system can be in given your state of knowledge of the system. Assuming one knows $E, V, N$ and the Hamiltonian of the system, $\mathcal{W}$ can be computed as the volume of a particular hypersurface (or a thin shell) in the phase space.
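As a toy illustration of $S = \log \mathcal{W}$ and of its dependence on what one knows (my own sketch, not from any of the linked answers; $k_B = 1$, and a discrete spin system instead of a phase-space volume):

```python
from math import comb, log

def entropy_two_level(N, n_up):
    """Boltzmann entropy S = ln W (k_B = 1) for N two-state spins
    with exactly n_up spins up: W = C(N, n_up)."""
    return log(comb(N, n_up))

N = 100
# Observer B only knows the total: 50 of the 100 spins are up.
S_B = entropy_two_level(N, 50)
# Observer A additionally knows each half has exactly 25 spins up,
# so fewer microstates are consistent with A's information.
S_A = 2 * entropy_two_level(N // 2, 25)
print(S_B, S_A, S_B >= S_A)  # A's entropy is lower (or equal)
```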

The natural question is whether or not the second law of thermodynamics can then be proven in the context of classical statistical mechanics. Two separate accepted and upvoted answers on this site seem to provide different answers. One argues quite persuasively that you cannot, that in order to derive the second law, one needs either asymmetric laws of motion or asymmetric initial conditions. The second says the second law may be proven, citing Boltzmann's H theorem.

I think underlying my confusion about this topic is a deeper confusion about what entropy actually means in statistical mechanics. Sometimes entropy has an almost Bayesian character, a sort of "entropy measures one's ignorance of a system" (view 3 below). In other contexts, entropy behaves more like a state variable, a mathematical function $S(E, V, N)$ intrinsic to a system (views 1 and 2 below).

  1. The entropy of a system in equilibrium is $S := \log \mathcal{W}(E,V,N)$. The entropy of a system out of equilibrium is undefined.

Under this view, how can we say the entropy of the universe is increasing? The universe is not in equilibrium, so its entropy is undefined.

  2. The entropy of a system in equilibrium is $S := \log \mathcal{W}(E,V,N)$. A system out of equilibrium is composed of many smaller subsystems, each of which can be considered approximately in equilibrium. The entropy of the system is the sum of the entropies of the subsystems.

This raises the question of whether the entropy of the system is independent of how it is partitioned into subsystems, or of how "in equilibrium" said subsystems are required to be. If these questions are resolved, the second law in this context would be "the entropy of a collection of noninteracting subsystems increases as they are allowed to interact." This interpretation of the second law is often "proven" in elementary statistical mechanics textbooks as a hand-wavy justification (e.g. two ideal gases of different densities separated by a wall; the wall is taken down and the entropy increases).
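A minimal numerical sketch of that textbook example (my own illustration: the same gas and temperature on both sides, $k_B = 1$, and only the density-dependent part of the ideal-gas entropy, $S \approx N\ln(V/N)$, since the temperature terms cancel):

```python
from math import log

def S_ideal(N, V):
    """Density-dependent part of an ideal-gas entropy at fixed T,
    S ~ N * ln(V/N) (k_B = 1, Stirling, temperature terms dropped)."""
    return N * log(V / N)

# Two samples of the same gas at the same T but different densities,
# each in volume V = 1.0, separated by a wall.
N1, N2, V = 1000, 3000, 1.0
S_before = S_ideal(N1, V) + S_ideal(N2, V)
# Wall removed: one gas of N1 + N2 particles in volume 2V.
S_after = S_ideal(N1 + N2, 2 * V)
print(S_after - S_before)  # positive unless N1 == N2
```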

In this context, the deeper "reason" for entropy increase is that the system is approaching equilibrium. But why does the system approach equilibrium rather than escape from it? Loschmidt's paradox and the assumption that the laws of motion are reversible would seem to imply that a system is equally likely to evolve away from an equilibrium state as toward one.

  3. Entropy, in a strict sense, does not exist. If observers A and B both observe a system, but A observes more about the system than B, then A's computed value of the entropy will be lower than B's, since fewer microstates are consistent with A's more detailed observations than with B's less detailed observations.

Under this interpretation, how can entropy increase be formulated at all? What if a hypothetical observer C performs more and more observations as time progresses? Won't the number of microstates then shrink as time goes on, as C's knowledge of the system increases? Further, if D knows the Hamiltonian and $E,V,N$, then in principle, if D computed $\mathcal{W}$ at some time, Liouville's theorem should imply a "conservation of entropy," since it implies a conservation of phase-space volume. The objection, I imagine, to D's approach would be that the system is not originally in equilibrium, but this just brings us back to item 1.
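(For reference, the standard way this tension is usually phrased, in notation I am adding here: Liouville's theorem conserves the fine-grained Gibbs entropy,
$$S_{\text{Gibbs}}[\rho(t)] = -\int \rho \ln \rho \, d\Gamma = \text{const under Hamiltonian flow},$$
whereas the Boltzmann entropy $S_B(t) = \ln \mathcal{W}\big(M(X(t))\big)$, computed from the macrostate $M$ of the actual microstate $X(t)$, is not conserved. The "conservation of entropy" D derives applies to the first quantity, while second-law statements are usually about the second.)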


The more statistical mechanics I read, the more confused I become about the meaning of entropy and why the second law of thermodynamics is true. Can anyone offer insight into which, if any, of my perspectives on entropy is "correct," and how some of the logical contradictions I see can be resolved?

2 Answers


lalala 09/23/2017.

You are raising a lot of questions, and I can understand that, since the whole topic is confusing (also to me). Let me try to give you some partial answers. Also thermodynamics is an 'old' topic with a long history (which makes it even more difficult).

Let's step back one more step, start from the beginning, and see what holds up later.

0) It is important to have a foundation, that is, that entropy is an experimental quantity. In experimental physics it is the heat transferred to your system divided by the temperature. (Of course, you would have to integrate from 0 kelvin onwards, where you set the entropy to zero.) It is an experimental 'fact' that for reversible processes this defines a state function.
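Written out (my paraphrase; it assumes the third-law convention $S(0) = 0$ and a reversible heating path):
$$S(T) = \int_0^T \frac{\delta Q_{\text{rev}}}{T'} = \int_0^T \frac{C(T')}{T'}\,dT',$$
where $C$ is the measured heat capacity along that path.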

1) "monotone, convex, extensive state function". So why do we believe this. State function basically from experiment, and from the impossibility of a perpetuum mobile of the second kind (also experimentally confirmed). Extensive: of course, if you think of heat capacity as a extensive quantity. Monotone in E: obvious if T is positive (you probably heard of 'negative' temperature, so this one might be changed in these special cases). Convexity:since dS = C d ln T and energy and T are monotonous, dS to be convex means positive heat capacity, which actually allows two bodies touching to reach equilibrium (convexity wont hold in the next section)

2) Already with classical thermodynamics we run into problems with 'the entropy of the universe is increasing'. First, the universe is not in equilibrium. OK, the standard solution is to go to local equilibrium (you would have to do the same in statistical mechanics). Now about the universe: gravitationally interacting systems have a negative heat capacity (mentioned in Thirring's book; at that time, and maybe still, this was a numerical result with no analytical proof available). What this means: if two clouds of 'dust' or whatever interact, the hotter one gets hotter and the colder one gets colder (less kinetic movement). This is important for star/galaxy formation. I just found these two papers (I don't know whether they are good or not, but they show that I am talking mainstream physics): https://arxiv.org/pdf/1603.00044.pdf and http://adsabs.harvard.edu/full/1977MNRAS.181..405L . Also, even worse, in an expanding universe energy conservation doesn't seem to work. Anyway, my interpretation of this is: the statement 'the entropy of the universe is increasing' does not have a solid scientific foundation, and is more like the statement 'experimental setup + laboratory = universe', so all entropy changes of the experiment have to come from the surroundings. But I am not an expert on this.
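For reference, the usual back-of-the-envelope argument for that negative heat capacity (my addition; a virial-theorem estimate, not a rigorous proof): for a virialized self-gravitating cloud of $N$ point masses,
$$2\langle K\rangle + \langle U\rangle = 0 \;\Rightarrow\; E = \langle K\rangle + \langle U\rangle = -\langle K\rangle = -\tfrac{3}{2}N k_B T,$$
$$C = \frac{dE}{dT} = -\tfrac{3}{2}N k_B < 0,$$
so a cloud that loses energy heats up, as described above.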

Now on to statistical mechanics. Things don't get better; they get a little bit worse. Anything that is a problem in classical thermodynamics (apart from deriving the quantities from an underlying Hamiltonian) cannot be resolved in statistical mechanics. I say this again: the purpose of statistical mechanics was to be able to derive the measurable quantities from the Hamiltonian before 'supercomputers' existed (and also, analytical solutions are nicer than numerical ones).

3) From the viewpoint of classical mechanics, every system in principle corresponds to a point in phase space. Taking the statistical view of the whole system rather than of individual entities/particles is slightly artificial from the standpoint of classical mechanics, and in some sense it is not true. It is basically based on the fact that for large systems the most probable regions of phase space give similar results. If you have a system where two parts of phase space describe wildly different macroscopic states, this treatment in the strict sense fails (freezing water is an example).

So I wouldn't assign too much 'reality' to $\mathcal{W}$.

4) "statistical mechanics entropy becomes a function of the system and your knowledge of it." I would rephrase this to 'macroscopic, objective restrictions'. Well, let me explain: if you have a gas, and for one instant, some guy gives you an USB-stick with all position and momenta of the particles, then even your 'knowledge' changed, entropy stayed the same. If the guy puts in a partition and all the gas is now only on one side (however he achieved that), then entropy changed. Change of entropy occurs if you restrict the accessible phase space and this is consistent with statistical mechanics. (Although you have to limit you calculation of phase space; this is a trivial example of how the naive approach fails. Take two vessels: one with gas, the other empty. Should you calculate the same phase space volume as if there is gas in both?)

Now I can probably address the rest of your questions:

Is the entropy independent of how the system is partitioned? Yes! It has to be (because we would only accept a stat-mech definition of entropy if it reproduces the experimental facts of 0). One has to adjust the definition of entropy to achieve this (by a permutation factor). This problem is called the Gibbs paradox.
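Spelled out (my sketch of the standard bookkeeping, ideal gas at fixed temperature): dividing the phase-space volume by $N!$ makes the entropy extensive, so partitioning a homogeneous gas changes nothing,
$$S = \ln\frac{\mathcal{W}}{N!} \approx N\ln\frac{V}{N} + N\times(\text{terms per particle}).$$
A gas of $2N$ particles in volume $2V$ then has the same total entropy whether described as one system or as two halves of $N$ particles in volume $V$ each; without the $1/N!$ the two descriptions would differ by $2N\ln 2$.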

For your question 3): as outlined above, entropy has nothing to do with the knowledge of persons A and B (let's forget about Maxwell's demon). At least in the statistical and classical description it doesn't.

On entropy change conflicting with Liouville's theorem: entropy change is a bit alien to the statistical approach, but let's try. Remember the two-vessel example above. Of course, if one contains a gas and the other does not, then when you connect them the entropy changes, and statistical mechanics will agree, if you view this as an expansion of phase space (which it is). If you insist that the phase volume of the gas stays the same, there are two answers. First, the phase volume becomes 'foam'-like, so it fills the full phase space without violating volume conservation (personal note: I find this only moderately convincing). My second, personal explanation is again that, since the assigned phase-space volume is artificial (and could change with knowledge), it is not important; the point is the accessible phase-space volume compatible with the macroscopic restrictions. So no real problem here. Of course, if you look at the instant the wall has been taken out, then you are in a highly non-equilibrium situation, and for that instant the statistical description is out of its depth.

Last (and probably least), a side note: the H in Boltzmann's theorem is not an H but a capital eta (which looks the same as H but is pronounced 'eta').

Does this address most of your questions?


Pentcho Valev 09/23/2017.

Entropy is not a state function. It was Clausius who said it is, but he was wrong. Here is the story:

If you define the entropy $S$ as a quantity that obeys the equation $dS = dQ_\text{rev}/T$, you will find that, so defined, the entropy is a state function FOR AN IDEAL GAS. Clausius was very impressed by this state-functionness and decided to prove that the entropy (so defined) is a state function for ANY system. So "Entropy is a state function" became a fundamental theorem in thermodynamics. Clausius deduced it from the assumption that any cycle can be decomposed into small Carnot cycles, and nowadays this deduction remains the only justification of "Entropy is a state function":

"Carnot Cycles: S is a State Function. Any reversible cycle can be thought of as a collection of Carnot cycles - this approximation becomes exact as cycles become infinitesimal. Entropy change around an individual cycle is zero. Sum of entropy changes over all cycles is zero." http://mutuslab.cs.uwindsor.ca/schurko/introphyschem/lectures/240_l10.pdf

"Entropy Changes in Arbitrary Cycles. What if we have a process which occurs in a cycle other than the Carnot cycle, e.g., the cycle depicted in Fig. 3. If entropy is a state function, cyclic integral of dS = 0, no matter what the nature of the cycle. In order to see that this is true, break up the cycle into sub-cycles, each of which is a Carnot cycle, as shown in Fig. 3. If we apply Eq. (7) to each piece, and add the results, we get zero for the sum." http://ronispc.chem.mcgill.ca/ronis/chem213/hnd8.pdf

The assumption on which "Entropy is a state function" is based - that any cycle can be subdivided into small Carnot cycles - is obviously false. An isothermal cycle CANNOT be subdivided into small Carnot cycles. A cycle involving the action of conservative forces CANNOT be subdivided into small Carnot cycles.

Conclusion: The belief that the entropy is a state function is totally unjustified. Any time scientists use the term "entropy", they don't know what they are talking about.
