-ENTROPY-

.

-as of [20 SEPTEMBER 2024]-

.

-in ‘statistical mechanics’, entropy is an ‘extensive property’ of a ‘thermodynamic system’-

.

[it is closely related to the number Ω of ‘microscopic configurations’ (known as ‘micro-states’) that are consistent with the ‘macroscopic quantities’ that characterize the ‘system’ (such as its ‘volume’ / ‘pressure’ / ‘temperature’)]

(under the assumption that each ‘micro-state’ is equally probable, the entropy ‘S’ is the ‘natural logarithm’ of the number of ‘micro-states’, multiplied by the boltzmann constant kB)

.

(formally (assuming equiprobable micro-states): S = kB ln Ω)

.

(‘macroscopic systems’ typically have a very large number Ω of ‘possible microscopic configurations’)

(for example, the ‘entropy’ of an ‘ideal gas’ is proportional to the number of gas molecules N)

(roughly 20 liters of ‘gas’ at ‘room temperature’ + ‘atmospheric pressure’ has N ≈ 6×10²³ (‘avogadro’s number’))

(at ‘equilibrium’, each of the Ω ≈ eᴺ configurations can be regarded as ‘random’ + ‘equally likely’)
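(the estimate above can be sketched numerically; a minimal sketch, assuming the exact SI value of the boltzmann constant and the rough Ω ≈ eᴺ micro-state count from the text…)

```python
import math

# Boltzmann's entropy formula for equiprobable micro-states: S = kB * ln(Omega)
KB = 1.380649e-23  # Boltzmann constant in J/K (exact SI value)

def boltzmann_entropy(ln_omega):
    """Entropy from the natural logarithm of the micro-state count Omega."""
    return KB * ln_omega

# For ~1 mole of ideal gas the text estimates Omega ≈ e^N with
# N = Avogadro's number, so ln(Omega) ≈ N and S ≈ N * kB.
N = 6.02214076e23          # Avogadro's number (exact SI value)
S = boltzmann_entropy(N)   # ln(e^N) = N
print(S)                   # ≈ 8.31 J/K (the molar gas constant R)
```

(note how the enormous micro-state count eᴺ collapses to an everyday-sized entropy of a few J/K, because the logarithm pulls the exponent down)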

(the ‘2nd law of thermodynamics’ states that the ‘entropy’ of an ‘isolated system’ never decreases over time)

(such systems spontaneously evolve towards ‘thermodynamic equilibrium’, the state with ‘maximum entropy’)

(‘non-isolated systems’ may lose ‘entropy’, provided their environment’s entropy increases by at least that amount so that the ‘total entropy’ increases)

(‘entropy’ is a ‘function’ of the ‘state’ of the ‘system’, so the ‘change’ in ‘entropy’ of a ‘system’ is determined by its ‘initial’ + ‘final’ states)

(in the ‘idealization’ that a ‘process’ is ‘reversible’, the ‘entropy’ does not change, while ‘irreversible processes’ always increase the ‘total entropy’)

(because it is determined by the number of ‘random micro-states’, ‘entropy’ is related to the amount of additional information needed to specify the ‘exact physical state’ of a ‘system’, given its ‘macroscopic specification’)

(for this reason, it is often said that ‘entropy’ is an expression of the ‘disorder’ (or ‘randomness’) of a ‘system’, or of the ‘lack of information about it’)

(the concept of ‘entropy’ plays a central role in ‘information theory’)
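(the information-theoretic counterpart, shannon entropy, measures the missing information in bits; a minimal sketch, not tied to any particular physical system…)

```python
import math

def shannon_entropy(probs):
    """Shannon entropy in bits: H = -sum(p * log2(p)) over outcomes with p > 0."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin: two equally likely outcomes -> 1 bit of missing information.
print(shannon_entropy([0.5, 0.5]))        # 1.0
# Four equally likely outcomes -> 2 bits.
print(shannon_entropy([0.25, 0.25, 0.25, 0.25]))  # 2.0
```

(for W equally likely outcomes this reduces to log₂ W, the same ‘logarithm of the number of configurations’ shape as the boltzmann formula, just with a different base and no physical constant)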

(‘boltzmann’s constant’ (and therefore ‘entropy’) has ‘dimensions’ of ‘energy’ divided by ‘temperature’, with the unit ‘joules per kelvin’ (J⋅K⁻¹) in the ‘international system of units’ (or kg⋅m²⋅s⁻²⋅K⁻¹ in terms of ‘base units’))

.

(the ‘entropy’ of a substance is usually given as an ‘intensive property’: either entropy per unit mass (SI unit: J⋅K⁻¹⋅kg⁻¹) or entropy per unit amount of substance (SI unit: J⋅K⁻¹⋅mol⁻¹))
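(converting between the two intensive forms is just division by the molar mass; the numbers below are illustrative assumptions, not tabulated data…)

```python
# Convert entropy per mole to entropy per kilogram by dividing by molar mass.
molar_entropy = 70.0   # J/(K*mol) -- assumed example value, not a measured datum
molar_mass = 0.018     # kg/mol    -- roughly water's molar mass, for illustration
specific_entropy = molar_entropy / molar_mass  # J/(K*kg)
print(specific_entropy)  # ≈ 3.9e3 J/(K*kg)
```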

.
