A friend of mine once casually asked me over a drink: “What is entropy?” Eeek! Interesting concept. But… How do you define entropy in a non-mathematical way? How can you sum up entropy in 30 seconds? In one mental image. In a single concept… In one word. A form of energy? A measure of disorder in the Universe? Randomness? All of the above? Tricky question. And then, I dropped my glass…
In Science, a process that is not reversible is called ‘irreversible’. Okay, so far so good… It’s pretty much what you would intuitively expect. The concept arises most frequently in the frame of Thermodynamics, as applied to processes.
Looked at this way, entropy is a measure of the change of state of a thermodynamic system.
Entropy is a measure of disorder.
All natural complex processes are irreversible.
This phenomenon of irreversibility results from the fact that if a thermodynamic system – which really means any sufficiently complex system of interacting molecules – is brought from one thermodynamic state to another, the configuration or arrangement of the atoms and molecules in the system will change in a way that cannot readily be undone.
During transformation, there will be a certain amount of heat energy loss or dissipation due to intermolecular friction and collisions.
A certain amount of “transformation energy” S will be expended as the molecules of the “working body” do work on each other when they change from one state to another. Should the process be reversed, that energy S will typically NOT be recoverable. Bear with me…
In Thermodynamic Equilibrium…
A thermodynamic system can be defined in terms of its states, specified in terms of physical and chemical variables that describe the macroscopic properties of that system. Equilibrium means a state of balance.
For a thermodynamic system to be in a state of equilibrium, there must be no net flows of matter or energy, no phase changes, nor unbalanced potentials or driving forces, within the system.
In Thermodynamics, once the state of a system and all of its surroundings has changed, it cannot be precisely restored to its initial state by infinitesimal changes in some property of the system without expenditure of energy. This is because, as the second law of thermodynamics tells us, real processes always dissipate some energy.
Theoretically speaking, a reversible process, or reversible cycle, can be “reversed” by applying infinitesimal changes to some property of the system, as long as this process can occur without entropy production – that is to say, without any dissipation of energy in the system.
Due to these infinitesimal changes, the system remains in thermodynamic equilibrium throughout the entire process. BUT… and it’s a big but…
Since it would take an infinite amount of time for the reversible process to finish, perfectly reversible processes are impossible.
A system that undergoes an irreversible process may still be capable of returning to its initial state. What cannot be done is restoring the environment to its own initial conditions, because
An irreversible process increases the entropy of the Universe.
But, because entropy is a state function, the change in entropy of a system is the same whether the process is reversible or irreversible. The second law of thermodynamics can be used to determine whether a process is reversible or not.
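To make that state-function point concrete, here is a minimal sketch in Python (using the textbook specific heat of water – the numbers are illustrative, not the point): the entropy change of heating water depends only on the initial and final temperatures, however the heating is actually carried out.

```python
import math

def entropy_change_heating(mass_kg, c_j_per_kg_k, t1_k, t2_k):
    """Entropy change of a substance heated from t1_k to t2_k.

    Because entropy is a state function, this depends only on the
    end states: dS = m * c * ln(T2 / T1), whether the heating was
    done reversibly or irreversibly.
    """
    return mass_kg * c_j_per_kg_k * math.log(t2_k / t1_k)

# 1 kg of water (c ≈ 4186 J/(kg·K)) heated from 20 °C to 80 °C:
dS = entropy_change_heating(1.0, 4186, 293.15, 353.15)
print(f"dS = {dS:.0f} J/K")  # same value regardless of the path taken
```

Whether you heat the water gently over an hour or plunge it into a furnace, the system’s entropy change comes out identical – only the entropy of the surroundings differs between the two paths.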
Order into Disorder: Defining Entropy
Broadly speaking, entropy is then a measure of ‘disorder’.
Classic examples for depicting entropy include:
a dropped cup or egg: it will smash into pieces upon reaching the floor, but those pieces will never spontaneously recombine back into a cup or an unbroken egg.
a hot cup of coffee: it will always cool down if left untouched, but it will never draw warmth from a room to heat itself back up.
Another example of entropy is given by comparing steam and ice. The random motion of molecules in steam corresponds to more disorder, and hence more entropy, than the more orderly motion of molecules in ice or even water.
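You can put rough numbers on that comparison. During a phase change at constant temperature, the entropy gained is simply dS = Q/T, with Q the latent heat. A quick sketch, using standard textbook latent heats for water:

```python
# Entropy gained in each phase change of 1 kg of H2O (dS = Q / T at constant T).
# Latent heat values are standard textbook figures.
L_FUSION = 334e3         # J/kg, ice -> water at 273.15 K
L_VAPORISATION = 2.26e6  # J/kg, water -> steam at 373.15 K

dS_melt = L_FUSION / 273.15
dS_boil = L_VAPORISATION / 373.15

print(f"ice -> water:   {dS_melt:.0f} J/K")
print(f"water -> steam: {dS_boil:.0f} J/K")  # roughly five times larger
```

Turning water into steam costs around five times more entropy than turning ice into water – a numerical echo of just how much more disordered the steam is.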
The term ‘entropy‘ was deliberately chosen to be reminiscent of energy, though the differences between the two quantities are just as important as their similarities. Entropy and energy are similar in that an isolated body may be said to have a certain ‘entropy content’, just as it may be said to have a certain ‘energy content’.
And you will NEVER see heat flowing spontaneously from a cold body to a hotter one.
When heat flows between a steak (or veggie burger for the vegetarians) and a plate, there is no violation of energy conservation: the energy lost by the steak is gained by the plate. However, the law of conservation of energy does not explain why the heat always flows from the hot steak to the cold plate; this is where the second law of thermodynamics comes in.
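The second law makes this a one-line calculation. When heat Q leaves a body at temperature T_hot and enters one at T_cold, the Universe’s entropy changes by Q/T_cold − Q/T_hot: positive when heat runs downhill, negative if it were to run uphill. A sketch with made-up but plausible temperatures:

```python
def total_entropy_change(q_joules, t_source_k, t_sink_k):
    """Net entropy change of the Universe when heat q_joules flows
    from a body at t_source_k into a body at t_sink_k."""
    return q_joules / t_sink_k - q_joules / t_source_k

# 1000 J flowing from a hot steak (330 K) to a cooler plate (295 K):
print(total_entropy_change(1000, 330, 295))  # positive: allowed
# The reverse flow, plate to steak, comes out negative: forbidden.
print(total_entropy_change(1000, 295, 330))
```

Both directions conserve energy perfectly; only one of them leaves the total entropy no lower than before, and that is the direction Nature takes.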
Budding inventors should note: an engine may convert energy from one form to another, but an engine cannot produce energy out of nothingness. The kinetic energy of the piston of a steam engine, for instance, has been paid for in advance by the heat transferred to the steam. Let that be a lesson to all those who ever dreamed of achieving “perpetual motion”.
Whenever energy is transferred or transformed, the final entropy of the Universe must be at least as high as the initial entropy. Usually, this means that heat flows are required to ensure that the total entropy does not decrease.
Inventors should again take note.
In most engines, heat is an unwanted by-product: the real aim is to transfer energy as work, perhaps to propel a vehicle or lift a weight. Since part of the energy initially stored in the fuel is inevitably wasted as heat, only a fraction is left to do useful work.
For this very reason,
Thermodynamics imposes fundamental limits on the efficiency of engines.
Fortunately, thermodynamics also suggests ways of increasing efficiency, which explains, for example, why a diesel engine is likely to be more efficient than a petrol engine. But I will go into that some other time…
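The best-known of those fundamental limits is the Carnot efficiency, 1 − T_cold/T_hot: no heat engine working between those two temperatures can do better, however cleverly it is built. A sketch with illustrative (assumed, not measured) temperatures – a diesel’s higher compression ratio gives it a hotter combustion temperature, and hence a higher second-law ceiling:

```python
def carnot_efficiency(t_hot_k, t_cold_k):
    """Maximum possible efficiency of any heat engine operating
    between a hot and a cold reservoir (the second-law limit)."""
    return 1.0 - t_cold_k / t_hot_k

# Illustrative peak combustion temperatures, with a 300 K environment:
print(f"petrol-like engine limit: {carnot_efficiency(1800, 300):.2f}")
print(f"diesel-like engine limit: {carnot_efficiency(2100, 300):.2f}")
```

Real engines fall well short of these ceilings, of course, but the ordering survives: the hotter you can run the hot side, the more of your fuel’s energy can in principle come out as work rather than waste heat.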
When a room-temperature object is placed inside a refrigerator, heat flows out of the object and its entropy decreases. Indeed, the refrigerator may be said to be a device for sucking entropy out of warm objects. How can such a decrease in entropy be consistent with the second law of thermodynamics?
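The resolution is that the refrigerator does not act alone: its compressor does work, and it dumps the extracted heat plus that work into the kitchen behind it. The room’s entropy rises by more than the food’s entropy falls. A quick bookkeeping sketch with illustrative numbers:

```python
def fridge_entropy_change(q_cold_j, t_cold_k, work_j, t_room_k):
    """Net entropy change when a refrigerator pumps q_cold_j of heat
    out of a cold object, using work_j of electrical work, and dumps
    the sum of both into the room."""
    dS_object = -q_cold_j / t_cold_k           # the object's entropy falls...
    dS_room = (q_cold_j + work_j) / t_room_k   # ...but the room's rises by more
    return dS_object + dS_room

# 1000 J pumped out of food at 275 K, with 200 J of compressor work,
# all rejected into a 295 K kitchen (illustrative numbers):
print(fridge_entropy_change(1000, 275, 200, 295))  # positive
```

So the entropy sucked out of your leftovers has not vanished; it has been exported, with interest, into the warm air at the back of the fridge.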
The total energy gained by a system (such as the plate) is the sum of the heat and the work transferred to it. And it is worth emphasising that heat and work are not themselves properties of a system. We cannot examine a plate and deduce that it has received so much energy from heat and so much energy from work. All that really counts is that the plate has a total amount of energy, and that any increase in this energy is the sum of the heat and work transferred to the plate.
This understanding of heat, work and energy is incorporated in the first law of thermodynamics.
From a modern perspective, the first law of thermodynamics is simply another way of stating the law of conservation of energy with the explicit recognition that you should take into account heat as a quantity of energy, alongside work, in any calculation.
I said it was difficult to define entropy properly. I say it again.
While the first law of thermodynamics ensures that the energy of an isolated system is always conserved, the second law of thermodynamics makes a slightly weaker assertion about entropy:
The total entropy of an isolated system can never decrease.
The requirement that the total entropy should not decrease has the effect of ruling out enormous numbers of processes that are perfectly consistent with the idea of energy conservation.
You Cannot Turn Back the Clock…
Many biological processes that were once thought to be reversible have been found to actually be a pairing of two irreversible processes. Whereas a single enzyme was once believed to catalyse both the forward and reverse chemical changes, research has found that two separate enzymes of similar structure are typically needed to perform what results in a pair of thermodynamically irreversible processes.
Recent advances in fields ranging from cosmology to computer science have hinted at a possible deep connection between intelligence and entropy maximisation, although a formal physical relationship between the two has yet to be established. In Cosmology, the causal entropic principle for anthropic selection has used the maximisation of entropy production in causally connected space-time regions as a thermodynamic proxy for intelligent observer concentrations in the prediction of cosmological parameters.
In Geoscience, entropy production maximisation has been proposed as a unifying principle for non-equilibrium processes underlying planetary development and the emergence of life.
Causal Entropy and Human Evolution
The intuitive idea of entropy describes the way in which the Universe heads inexorably toward a higher state of disorder. A modification to this fundamental law of Physics may provide a link to the rise of intelligence, cooperation, and even upright walking.
Over the past 10 to 15 years, many researchers from a variety of scientific disciplines have hinted that there is a deep link between entropy production and intelligence.
A mathematical model published last year in Physical Review Letters proposes that systems act so as to maximise entropy over both the present and their possible futures. Simple simulations based on the idea reproduce a variety of real-world behaviours that look like intelligence.
Wissner-Gross and Freer argue that intelligent behaviour – something that is inherently hard to quantify – can be reduced to maximising one’s options – something that is relatively easy to quantify. However, the model does not derive intelligent behaviour from first principles: it cannot explain how such an ‘intelligent agent’ evolved in the first place, nor why it should seek to maximise its future options.
Among a number of examples, the simplest model considers the behaviour of a pendulum hanging from a moving cart. Simulations of the causal entropy idea show that the pendulum ultimately ends up pointing upwards – an unstable situation, but one in which the pendulum is free to explore a broader range of alternative positions. The researchers liken this situation to the development of upright walking.
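To get a feel for the idea without the full physics, here is a deliberately toy sketch (entirely hypothetical, and far simpler than the Wissner-Gross and Freer model): an agent in a walled one-dimensional corridor greedily picks, at each step, whichever move leaves it the most distinct positions reachable within a fixed horizon. Keeping future options open drives it away from the walls, loosely analogous to the pendulum drifting to the position from which it can explore the most alternatives.

```python
# Toy "keep your options open" agent in a corridor with walls at both ends.
CORRIDOR = range(0, 11)  # positions 0..10
HORIZON = 4              # how many unit steps ahead the agent looks

def reachable_states(pos, horizon):
    """Count the distinct positions reachable within `horizon` unit steps."""
    lo = max(min(CORRIDOR), pos - horizon)
    hi = min(max(CORRIDOR), pos + horizon)
    return hi - lo + 1

def step(pos):
    """Greedily choose the move that keeps the most future options open."""
    moves = [m for m in (-1, 0, 1) if pos + m in CORRIDOR]
    return pos + max(moves, key=lambda m: reachable_states(pos + m, HORIZON))

pos = 0  # start pressed against the left wall
for _ in range(6):
    pos = step(pos)
print(pos)  # the agent has drifted away from the wall, toward open space
```

Nothing here "wants" anything; the drift toward the middle falls out of the single rule of maximising future options – which is exactly the flavour of argument the causal entropy work makes, at vastly greater sophistication.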
So, does causal entropy really explain the rise of intelligence and upright walking? Probably not. I wish it did…