Work with me, please.
by The Flash (2019-09-22 18:46:00)

In reply to: I didn't read the whole paper but some of the ...  posted by Barney68


Because your statements and the author's are all new to my formerly naive
idea of entropy, which was not disorder but permanent, irreversible change
of the system from one state to another, not an infinite number of states,
except for the intermediary states between the first and the last.

You wrote:
1. "Entropy is driven by the number of possible states in the system.
Disorder is the response of the system to that number of possible states;
equilibrium occurs when the likelihood of every possible state is equal
and disorder is at a maximum."

Hence, is "entropy" merely a descriptor for the complexity or degrees of
freedom in the system? Is entropy a "function" of the complexity or degrees
of freedom in the system? What is the measure of "disorder," of which we
must find the maximum? What is the measure of the "likelihood" of every
possible state? Can those measures ever be equalized, even in a simple
system with only two elements?
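To make my question concrete, here is a toy two-state sketch of what I imagine
is meant (assuming the usual statistical definitions, which may be exactly what
I am misunderstanding): disorder measured as a function of the probabilities of
the possible states, largest when every state is equally likely.

    import math

    def disorder(probabilities):
        # Shannon/Boltzmann-style entropy of a discrete distribution,
        # in units of k_B: S = -sum(p * ln(p)).
        return -sum(p * math.log(p) for p in probabilities if p > 0)

    # A "simple system with only two elements" (two possible states):
    for p in (0.0, 0.1, 0.25, 0.5, 0.75, 0.9, 1.0):
        print(p, disorder([p, 1.0 - p]))
    # The value peaks at p = 0.5, i.e. when both states are equally
    # likely, which seems to be the "maximum disorder at equilibrium"
    # in the statement quoted above.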


2. "If there is no change to the system, entropy remains the same, but so
does the number of possible states and with that number of possible
states, so does the disorder."

So, entropy is always the consequence of change? But the number of possible
states in a continuous or flowing system, even before any change occurs,
is infinite, isn't it?

3. "Taking the water example, a given water molecule can be anywhere in
the system. That does not change as long as the system is isothermal and
sealed or isentropic."

So, entropy, as a description of the state of the system, is a function of
the location of the elements in the system?

Thank you.


Doing my best ...
by Barney68 (2019-09-22 22:00:40)

Entropy is a very simple concept that is very complex in application. For a fuller understanding of it, I'd suggest two of the courses in the Teaching Company's Great Courses series: "Thermodynamics: Four Laws that Move the Universe," and "Mysteries of Modern Physics: Time." They explain it better than I can.

That said ...

1. No, entropy is not merely a descriptor. It is a real thermodynamic function with a real equation (dS = δq_rev/T) that comes up in many ways. I learned it as an undergrad in the context of efficiency: no system can ever be 100% efficient; there are always losses, and entropy always increases. That's the second law of thermodynamics. It's a mathematical measure of inefficiency.
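To put a tiny numerical sketch on that bookkeeping (my numbers are just for illustration): when heat flows irreversibly from a hot body to a cold one, the hot side loses Q/T_hot of entropy but the cold side gains the larger amount Q/T_cold, so the total always goes up, and the Carnot limit caps how much of that heat could ever have been turned into work.

    # Entropy bookkeeping for Q joules of heat flowing irreversibly
    # from a hot reservoir to a cold one (illustrative numbers only).
    Q = 1000.0      # joules transferred
    T_hot = 500.0   # kelvin
    T_cold = 300.0  # kelvin

    dS_hot = -Q / T_hot           # hot reservoir loses entropy: -2.0 J/K
    dS_cold = Q / T_cold          # cold reservoir gains more:   +3.33 J/K
    dS_total = dS_hot + dS_cold   # net increase, > 0 whenever T_cold < T_hot

    carnot_efficiency = 1.0 - T_cold / T_hot   # 0.4: at best 40% of Q becomes work

    print(dS_total, carnot_efficiency)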

2. If the system is totally static, entropy remains constant. That's a theoretical situation, not a real one. Any change in a real (that is, imperfect) system results in an increase in entropy.

An example: the kilogram used to be referenced to a chunk of platinum held in a vault in France. You might think that was a constant, but it was not, because every time someone picked it up to put it on a balance to certify a copy, a few atoms would come off due to friction and be lost to the reference. That loss of atoms was an increase in entropy.

3. The entropy of the system is a measure of how disordered the system is; the change in entropy is a measure of a loss of order.

Imagine a disk brake on a car. The disk and the pad are ordered systems, and the car has kinetic energy. When the driver wants to stop, the brake is applied and energy is dissipated as heat; the disk and pad wear as part of the process and eventually have to be replaced. That's an intentional increase in entropy for the overall system, because the driver uses that overall increase in disorder (heat lost to the environment, damage to the disk and pad) to achieve the desired end state of stopping the car. This example shows why using electrical braking to recharge the battery improves efficiency: disorder is still increased, but not by as much.
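To put rough numbers on the brake example (my own back-of-the-envelope figures, assuming all of the kinetic energy ends up as heat at ambient temperature):

    # Rough entropy increase from braking a car to a stop, assuming the
    # kinetic energy is all dumped into the surroundings as heat.
    m = 1500.0        # kg, assumed car mass
    v = 27.0          # m/s, roughly 60 mph
    T_ambient = 300.0 # kelvin

    kinetic_energy = 0.5 * m * v**2            # about 547 kJ
    dS_friction = kinetic_energy / T_ambient   # about 1,800 J/K of added disorder

    # Regenerative braking: suppose 60% goes back into the battery and
    # only the remaining 40% is dissipated as heat.
    dS_regen = 0.4 * kinetic_energy / T_ambient   # about 730 J/K

    print(dS_friction, dS_regen)   # disorder still increases, but by less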

Does that answer your question?


Oh. That is excellent. I understand it. Thank you. However,
by The Flash (2019-09-23 18:56:02)

I am unclear about the part where it has been stated that entropy is a
count of the number of possible states the system can hold.

If I understand that correctly, it would not matter what system it is: a
complex weather pattern, water molecules in the ocean, thermodynamics in
a volcano, an explosion inside a tank, or just one CO2 molecule inside a
balloon. They will all have an infinite number of possible states, some
more infinite than others (lol) by order of volume, won't they? The entropy
will grow exponentially as the number of elements in the system increases, won't it?

So, if that is the case, how does a measure of that number of possible
states even matter, since it will always be infinity?
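Here is the kind of counting I am imagining (a toy discretization; the
S = k*ln(W) formula is my assumption about how the count gets turned into a
number, and maybe that is exactly where I am going wrong):

    import math

    k_B = 1.380649e-23   # J/K, Boltzmann constant

    def toy_entropy(n_molecules, n_cells):
        # Pretend each molecule can sit in any one of n_cells positions,
        # so the number of arrangements is W = n_cells ** n_molecules.
        # Working with ln(W) = n_molecules * ln(n_cells) sidesteps ever
        # writing out the astronomically large W itself.
        ln_W = n_molecules * math.log(n_cells)
        return k_B * ln_W        # S = k_B * ln(W)

    print(toy_entropy(1, 10**6))        # one CO2 molecule in a balloon
    print(toy_entropy(10**23, 10**6))   # ocean-scale numbers of molecules
    # The count W explodes with the number of molecules, but the entropy
    # only grows in proportion to it, because of the logarithm.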

Also, is it possible entropy is not a measure of how the system got
"disordered" so much as of how it just got rearranged, or moved differently?

For the sake of another example, years ago I worked for an oil shale
development company. The engineers and scientists who invented the patented
process (the only one in the world that ever worked) told me that they
never fully understood what occurred thermodynamically inside our retort.
However, they knew enough to make it work economically. Pour oil shale
into it, add heat, remove shale oil, carbon, and off gases. So, to the
human mind it is orderly enough to be practical, just rearranged and then
diverted physically.

Another example might be the Chicxulub asteroid impact event 66 million
years ago. The earth remained in a fairly steady state by geologic standards,
the asteroid struck, entropy increased, entropy decreased, a new steady
state started. I watched a great Netflix documentary about it last night.
It is called "Day the Dinosaurs Died." Scientists have drilled cores out
of the impact site (down 1/2 mile) that depict the layers of matter before
and after. Really cool. (https://www.netflix.com/title/81121175)