Entropy




Updated:
2013-01-06

Entropy (S) is a slippery quantity, and the term is often used very sloppily.

Often, entropy is referred to as a measure of disorder ('un-order'). Why this is wrong is explained below.

Firstly, entropy is not defined far from thermodynamic equilibrium; that is, it is impossible to state the entropy of a chicken, or even of a desk.

I prefer the lengthier term 'low-exergy energy' to the word 'entropy'.

Entropy is related to exergy in the sense that when the exergy of a system diminishes, its entropy increases.
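One way to make this relation quantitative (an addition for illustration; it is not part of the original page) is the Gouy-Stodola theorem from engineering thermodynamics: the exergy destroyed in a process equals the absolute temperature of the surroundings times the entropy generated,

    exergy destroyed = T0 * S_generated

where T0 is the temperature of the surroundings in kelvin. Growing entropy is thus the bookkeeping trace of consumed exergy, not a substance in itself.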

This should not be understood as though entropy were negative exergy.
Look at the 'energy tube': imagine the paste coming out of it as exergy, and the inward dent left in the tube as the entropy. As in the toothpaste case, you cannot take the dent and 'unbrush' your teeth.

Consider the figure below.
The horizontal axis represents different systems, from very ordered to chaotic.
The vertical axis could be called 'exergy consumption capacity' from zero and up, and 'entropy' below zero.
At absolute zero (-273.15 degrees Celsius) entropy is at a maximum: no work can be done. An example of a very ordered system is a crystal. The system to the left in the figure is a crystal at absolute zero. It is a perfectly ordered system.

Here, it is possible to calculate the position of a particular atom from information about the positions of its neighbours.
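As a minimal sketch of this point (hypothetical Python, not from the original page): in a perfect one-dimensional lattice, every atom sits exactly at the midpoint of its two neighbours, so its position can be computed rather than measured.

    # A perfect 1-D crystal: atoms evenly spaced at multiples of
    # the lattice constant a (units are arbitrary).
    a = 0.5
    positions = [i * a for i in range(100)]

    # The position of atom i follows exactly from its neighbours:
    i = 42
    predicted = (positions[i - 1] + positions[i + 1]) / 2
    assert abs(predicted - positions[i]) < 1e-12  # perfect prediction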

In the middle is an organized system, such as a chicken or another self-organizing system. Here, it is impossible to calculate the position of an atom from information about the positions of its neighbours, just as in the chaotic case below.
But you can still understand and predict the system, since it is organized. (Humans are very good at recognizing and understanding organization.)

In the organized system, the entropy is very low. Self-organizing systems expel high-entropy energy and matter, and develop towards a state where their ability to consume exergy is at a local maximum.

To the right in the figure is the chaotic system. Here, it is impossible to calculate the position of an atom from information about the positions of its neighbours; a sketch contrasting this with the crystal follows below. Also in this system, the entropy is very high.
Thus, if entropy is to be called un- or dis-something, that something is organization.
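For contrast with the crystal sketch above (again hypothetical Python, not from the original page), the same neighbour-midpoint prediction fails when the atoms are thrown down at random:

    import random

    random.seed(0)
    # A 'chaotic' system: 100 atoms at random positions on a line.
    positions = sorted(random.uniform(0, 50) for _ in range(100))

    # The same neighbour-midpoint prediction as for the crystal:
    i = 42
    predicted = (positions[i - 1] + positions[i + 1]) / 2
    print(abs(predicted - positions[i]))  # no longer (near) zero

The organized system in the middle would fail this atom-level test too, but unlike the chaotic one it remains predictable at the level of its organization, which is exactly the distinction the figure makes.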
