Entropy as disorder: History of a misconception
by cbiebel (2019-09-22 13:47:44)

I ran across this interesting paper on why so many people confuse entropy and disorder, which are not actually the same, and where this misconception comes from.






Looks like there are a few definitions
by SEE (2019-09-23 09:37:09)

The more human definition, to me, implies that we try to create order so that we can accomplish things and that the natural tendency of humans is toward disorder. We break that which we build.

At least, that's how I've spun it.


Professor Scheidt always told us to "Fight entropy!" *
by Barrister (2019-09-23 08:26:58)


It's entropy; it's not a human issue.
by NDBass (2019-09-22 21:27:00)


I never really "got" entropy until I saw it in stat mech.
by Mr Wednesday (2019-09-22 19:11:25)

Which is, incidentally, the definition the paper is based on: entropy as a count of the states available to the system or, more specifically, the log of the number of available states.
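
(A minimal sketch of that counting definition, Boltzmann's S = k_B ln(Omega), using a made-up toy system of N independent two-state particles; the numbers are only for illustration.)

import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def boltzmann_entropy(num_microstates):
    # Entropy as the log of the number of available microstates: S = k_B * ln(Omega)
    return K_B * math.log(num_microstates)

# Toy system: N independent two-state particles have Omega = 2**N microstates,
# so S = k_B * N * ln(2); entropy tracks the log of the state count.
N = 100
omega = 2 ** N
print(boltzmann_entropy(omega))   # from the direct count of states
print(K_B * N * math.log(2))      # same value, computed from N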


I first learned of entropy from the much simpler example....
by The Flash (2019-09-22 18:30:50)

....of Brownian motion, which is not even mentioned in this article.

Has entropy been extended to the social sciences? Is it purely a physical concept?


I didn't read the whole paper but some of the ...
by Barney68 (2019-09-22 17:22:12)

early examples are, at best, a bit misleading based on my understanding of entropy.

Entropy is driven by the number of possible states in the system. Disorder is the response of the system to that number of possible states; equilibrium occurs when the likelihood of every possible state is equal and disorder is at a maximum.

If there is no change to the system, entropy remains the same, but so does the number of possible states and with that number of possible states, so does the disorder.

Taking the water example, a given water molecule can be anywhere in the system. That does not change as long as the system is isothermal and sealed or isentropic.
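
(To illustrate the "equal likelihood of every state = maximum disorder" point above, here is a small sketch using the Gibbs/Shannon form S = -k * sum(p_i * ln p_i); the example distributions are invented for illustration.)

import math

def gibbs_entropy(probs, k=1.0):
    # S = -k * sum(p * ln p) over the possible states (terms with p = 0 contribute nothing)
    return -k * sum(p * math.log(p) for p in probs if p > 0)

# Four possible states of a toy system, with progressively more even probabilities.
distributions = {
    "all probability in one state": [1.0, 0.0, 0.0, 0.0],
    "unevenly spread":              [0.7, 0.1, 0.1, 0.1],
    "every state equally likely":   [0.25, 0.25, 0.25, 0.25],
}
for label, p in distributions.items():
    print(label, "-> S =", round(gibbs_entropy(p), 3))
# The uniform distribution gives the largest S: the equilibrium / maximum-disorder case.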


Work with me, please.
by The Flash (2019-09-22 18:46:00)

Because your statements and the author's are all new to me. My formerly naive idea of entropy was not disorder but permanent, irreversible change of the system from one state to another, not an infinite number of states, except for all the intermediary states from the first to the last.

You wrote:
1. "Entropy is driven by the number of possible states in the system.
Disorder is the response of the system to that number of possible states;
equilibrium occurs when the likelihood of every possible state is equal
and disorder is at a maximum."

Hence, is "entropy" merely a descriptor for the complexity or degrees of
freedom in the system? Is entropy a "function" of the complexity or degrees
of freedom in the system? What is the measure of "disorder," of which we
must find the maximum? What is the measure of the "likelihood" of every
possible state? Can those measures ever be equalized, even in a simple system with only two elements?


2. "If there is no change to the system, entropy remains the same, but so
does the number of possible states and with that number of possible
states, so does the disorder."

So, entropy is always the consequence of change? But the number of possible states in a continuous or flowing system, even before any change occurs, is infinite, isn't it?

3. "Taking the water example, a given water molecule can be anywhere in
the system. That does not change as long as the system is isothermal and
sealed or isentropic."

So entropy, as a description of the state of the system, is a function of the location of the elements in the system?

Thank you.


Doing my best ...
by Barney68 (2019-09-22 22:00:40)

Entropy is a very simple concept that is very complex in application. For a fuller understanding of it, I'd suggest two of the courses in the Teaching Company's Great Courses series: "Thermodynamics: Four Laws that Move the Universe," and "Mysteries of Modern Physics: Time." They explain it better than I can.

That said ...

1. No, entropy is not merely a descriptor. It is a real thermodynamic function with a real equation (ΔS = ∫ δq_rev/T for a reversible process) that comes up in many ways. I learned it as an undergrad in the context of efficiency: no system can ever be 100% efficient, there are always losses, and entropy always increases; that's the second law of thermodynamics. It's a mathematical measure of inefficiency.

2. If the system is totally static, entropy remains constant. That's a theoretical situation, not a real one. Any change in a real (that is, imperfect) system results in an increase in entropy.

An example: the kilogram used to be referenced to a chunk of platinum held in a vault in France. You might think that was a constant, but it was not, because every time someone picked it up to put it on a balance to certify a copy, a few atoms would come off due to friction and be lost from the reference. That loss of atoms was an increase in entropy.

3. The entropy of the system is a measure of how disordered the system is; the change in entropy is a measure of a loss of order.

Imagine a disk brake on a car. The disk and the pad are ordered systems, and the car has kinetic energy. When the driver wants to stop, the brake is applied and energy is dissipated as heat; the disk and pad wear as part of the process and eventually have to be replaced. That's an intentional increase in entropy for the overall system, because the driver uses that overall increase in disorder (heat lost to the environment, damage to the disk and pad) to achieve the desired end state of stopping the car. This example shows why using electrical braking to recharge the battery improves efficiency: disorder is still increased, but not as much.
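
(To put rough numbers on the brake example, a back-of-the-envelope sketch: all the values below are assumptions for illustration, and the dissipated heat is treated as if it were transferred to the surroundings at a roughly constant ambient temperature, so that ΔS ≈ Q/T.)

# Rough sketch of the brake example above; every number is an assumption.
mass = 1500.0        # kg, a typical car
speed = 27.0         # m/s, roughly 60 mph
T_ambient = 300.0    # K, temperature of the surroundings absorbing the heat

kinetic_energy = 0.5 * mass * speed ** 2   # J, all dissipated as heat when the car stops
delta_S = kinetic_energy / T_ambient       # J/K, from delta_S = Q/T at ~constant T

print(round(kinetic_energy / 1000), "kJ dumped to the surroundings")
print(round(delta_S), "J/K approximate entropy increase")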

Does that answer your question?


Oh. That is excellent. I understand it. Thank you. However,
by The Flash (2019-09-23 18:56:02)

I am unclear about the part where it has been stated that entropy is a
count of the number of possible states the system can hold.

If I understand that correctly, it would not matter what system it is, whether it is a complex weather pattern, water molecules in the ocean, thermodynamics in a volcano, an explosion inside a tank, or just one CO2 molecule inside a balloon. They will all have an infinite number of possible states, some more infinite than others (lol) by order of volume, won't they? And the entropy will grow exponentially as the number of elements in the system increases, won't it?

So, if that is the case, how does a measure of that number of possible
states even matter, since it will always be infinity?

Also, is it possible entropy is not a measure of how the system got "disordered" so much as of how it just got rearranged, or moved differently?

For the sake of another example, years ago I worked for an oil shale
development company. The engineers and scientists who invented the patented
process (the only one in the world that ever worked) told me that they
never fully understood what occurred thermodynamically inside our retort.
However, they knew enough to make it work economically. Pour oil shale
into it, add heat, remove shale oil, carbon, and off gases. So, to the
human mind it is orderly enough to be practical, just rearranged and then
diverted physically.

Another example might be the Chicxulub asteroid impact event 66 million
years ago. The earth remained in a fairly steady state by geologic standards,
the asteroid struck, entropy increased, entropy decreased, a new steady
state started. I watched a great Netflix documentary about it last night.
It is called "Day the Dinosaurs Died." Scientists have drilled cores out
of the impact site (down 1/2 mile) that depict the layers of matter before
and after. Really cool. (https://www.netflix.com/title/81121175)



That's interesting, but also a bit maddening...
by Kbyrnes (2019-09-22 16:35:00)

...since he never defines "entropy" as he understands it. There are plenty of spots where he says that entropy isn't something, but not one where he says just what it is. He comes close when he cites Clausius:

"Finally, in 1865, he coined the term 'entropy,' writing, 'I propose to call the magnitude S the entropy of the body, from the Greek word τροπη, transformation.'"

It seems like the biggest problem is that the term "entropy," which was devised to express something about the state of a thermodynamic system, has come into use outside of thermodynamics. See link for another kind of discussion.

Some day I'll try to convince our younger daughter to write up her theory that we use the term "random" very loosely, and that even random number generators don't operate randomly. (The literature recognizes pseudo-random generation and "true" random generation, but her argument is that even the latter is random only because we can't predict the inputs, which would be predictable if we knew more about them.)
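
(A small aside on the pseudo-random point, as a sketch: a seeded pseudo-random generator is completely deterministic, so "random" here really means "hard to predict without knowing the seed and the algorithm.")

import random

# The same seed reproduces exactly the same "random" sequence:
# the generator is a deterministic function of its seed.
random.seed(42)
first_run = [random.random() for _ in range(3)]

random.seed(42)
second_run = [random.random() for _ in range(3)]

print(first_run == second_run)   # True: nothing is unpredictable once the seed is known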


Entropy can be used to explain a lot of things.
by Barney68 (2019-09-22 17:52:04)

Most fundamentally, increasing entropy is what demands that time move only in one direction, from past to future.

If you're interested in the Great Courses, they have a couple that deal with entropy extensively. A bit time-consuming, but worth it, IMHO.


I must look for them because I don't understand it correctly. *
by The Flash (2019-09-22 18:48:38)


Thanks for the share! *
by Porpoiseboy (2019-09-22 14:34:28)