Work with me, please.
by The Flash (2019-09-22 18:46:00)

In reply to: I didn't read the whole paper but some of the ...  posted by Barney68


Because your statements and the author's are all new to me. My formerly
naive idea of entropy was not disorder but permanent, irreversible change
of the system from one state to another state; not an infinite number of
states, apart from the intermediary states between the first and the last.

You wrote:
1. "Entropy is driven by the number of possible states in the system.
Disorder is the response of the system to that number of possible states;
equilibrium occurs when the likelihood of every possible state is equal
and disorder is at a maximum."

Hence, is "entropy" merely a descriptor for the complexity or degrees of
freedom in the system? Is entropy a "function" of the complexity or degrees
of freedom in the system? What is the measure of "disorder," of which we
must find the maximum? What is the measure of the "likelihood" of every
possible state? Can those measures ever be equalized, even in a simple
system with only two elements?
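For what it's worth, here is how I picture the "measure" part, as a toy sketch of my own (in Python; nothing from the paper): the Gibbs/Shannon entropy S = -sum(p_i ln p_i), in units of k_B, applied to a system with only two states. The disorder measure peaks exactly when both states are equally likely, so yes, even a two-element system has a well-defined maximum:

```python
import math

def entropy(probs):
    """Gibbs/Shannon entropy S = -sum(p * ln p), in units of k_B.

    Terms with p == 0 contribute nothing (the p*ln(p) limit is 0).
    """
    return -sum(p * math.log(p) for p in probs if p > 0)

# Two-state system: sweep the probability p of the first state.
for p in [0.1, 0.3, 0.5, 0.7, 0.9]:
    print(f"p = {p:.1f}  S = {entropy([p, 1 - p]):.4f}")

# The maximum occurs at p = 0.5, where S = ln 2 (about 0.6931):
# "equilibrium occurs when the likelihood of every possible state is equal."
```

On this reading, "disorder" is just the value of S, and "likelihood of every possible state" is the set of probabilities p_i; equalizing them is what maximizes S.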


2. "If there is no change to the system, entropy remains the same, but so
does the number of possible states and with that number of possible
states, so does the disorder."

So entropy is always the consequence of change? But in a continuous or
flowing system, isn't the number of possible states infinite, even before
any change occurs?

3. "Taking the water example, a given water molecule can be anywhere in
the system. That does not change as long as the system is isothermal and
sealed or isentropic."

So entropy, as a description of the state of the system, is a function of
the locations of the elements in the system?
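If it helps, this is how I read point 3, again as a toy sketch of my own rather than anything from the paper: in Boltzmann counting, entropy depends on how many places the molecules *could* be, not on where any particular molecule actually is. Here N indistinguishable molecules sit on M lattice sites, the count of positional microstates is Omega = C(M, N), and S = ln Omega (in units of k_B):

```python
import math

def positional_entropy(sites, molecules):
    """Boltzmann entropy S = ln Omega (units of k_B) for N
    indistinguishable molecules distributed over M lattice sites,
    where Omega = C(M, N) counts the positional microstates."""
    omega = math.comb(sites, molecules)
    return math.log(omega)

# A sealed, isothermal box: M and N do not change, so Omega and S
# do not change either, even though each molecule "can be anywhere."
print(positional_entropy(100, 10))
```

So the answer to the question would be: not a function of the actual locations, but of the number of possible location arrangements, which stays fixed as long as the system is sealed and isothermal.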

Thank you.

