A Theory of Life… The Physics of Cells and Macroscopic Irreversibility

A meme that reads: "Life has No Ctrl + Z".

“It’s Life!  But Not as We Know it…”

There is one essential difference between living things and inanimate clumps of carbon atoms.  From a purely physical point of view, the former tend to be much better at capturing energy from their environment and dissipating that energy as heat.  At MIT, Jeremy England derived a mathematical formula that he believes explains this capacity.

Every species of living thing can make a copy of itself by exchanging energy and matter with its surroundings.  One feature common to all such examples of spontaneous “self-replication” is their statistical irreversibility.

Clearly, one bacterium is much more likely to turn into two than two bacteria are to spontaneously revert back into one.  From the point of view of Physics, this observation contains an intriguing hint of how the properties of self-replicators must be constrained by thermodynamic laws, which dictate that irreversibility is always accompanied by an increase of entropy.

England, who trained in both Biochemistry and Physics, started his own lab at MIT (Massachusetts Institute of Technology) four years ago and decided to apply his knowledge of statistical physics to biology.

 

A Statistical Probability

A graph showing the Maxwell-Boltzmann distribution of velocities at energies corresponding to temperatures T = 200 K, T = 400 K and T = 800 K.
In statistical mechanics, the Boltzmann distribution is a probability distribution that gives the probability that a system will be in a certain state as a function of that state’s energy and the temperature of the system.  As temperature increases, the probability of finding molecules at higher energy increases.  Source: InformationPhilosopher.com
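In standard notation (a textbook statement rather than anything specific to this article), the Boltzmann distribution assigns a microstate of energy E_i the probability

p_i = \frac{e^{-E_i/k_B T}}{\sum_j e^{-E_j/k_B T}}

where k_B is Boltzmann’s constant and T is the temperature.  Raising T flattens the exponential, which is why higher-energy states become more probable as the system gets hotter.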

For a long time, it has been considered challenging to speak in universal terms about the statistical physics of living systems, because they invariably operate very far from thermodynamic equilibrium, and therefore need not obey a simple Boltzmann probability distribution over microscopic arrangements.  Past hypotheses about life’s origins have involved a primordial soup and one immense stroke of luck.  But England’s theory is different.

According to this physicist, the origin of life and its subsequent evolution follow from the fundamental laws of nature, and they ought to be

“As unsurprising as rocks rolling downhill.”

 

Little by little, under certain conditions, England believes matter inexorably acquires the key physical attribute associated with life.

 

A New Formula for Life

The formula is based on established Physics.  It indicates that when a group of atoms is driven by an external source of energy, such as the Sun, and surrounded by a heat bath, such as the ocean or the atmosphere, the group will restructure itself so as to dissipate more and more energy.

“You start with a random clump of atoms, and if you shine light on it for long enough, it should not be so surprising that you get a plant.”

 

An original illustration by Charles Darwin of his famous Galapagos finches.
Darwin’s finches

Darwin’s theory of evolution by natural selection provides a powerful description of life at the level of genes and populations.  However, from the perspective of physics, you might call Darwinian evolution a special case of a more general phenomenon.

England’s theory is meant to underlie, rather than replace, Darwinian evolution.  But his idea, detailed in a recent paper, sparked controversy among his colleagues, who see it as either tenuous or a potential breakthrough, or both.

At this point, England’s ideas are extremely speculative when it comes to living phenomena, but they are potentially promising.  The hope is that he has identified the underlying physical principle driving the origin and evolution of life.

England’s theoretical results are broadly considered valid.  It is his interpretation – that his formula represents the driving force behind a class of phenomena in nature that includes life – that remains unproven.  But already, there are ideas about how to test that interpretation in the lab.

This would be a radically different approach.  Harvard Physics professor Mara Prentiss, who is contemplating such an experiment after learning about England’s work, says:

“Right or wrong, it’s going to be very much worth the investigation.”

 

An image illustrating the concept of entropy. The picture shows a coffee cup being dropped and subsequently breaking into pieces on the floor.

The Second Law of Thermodynamics

At the very heart of England’s idea is the second law of thermodynamics, also known as the law of increasing entropy or the “arrow of time.”

It’s the great inevitable…

Hot things cool down, gas diffuses through air, eggs scramble but never spontaneously unscramble.

Energy tends to disperse or spread out as time progresses.

 

Entropy

Entropy is a measure of this tendency, quantifying how dispersed the energy is among the particles in a system, and how diffuse those particles are throughout space.  Entropy increases as a simple matter of probability.  There are more ways for energy to be spread out than for it to be concentrated.  As particles in a system move around and interact, they will tend to adopt configurations in which the energy is spread out.  Eventually, the system arrives at a state of maximum entropy called “thermodynamic equilibrium,” in which energy is uniformly distributed.
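This counting argument is captured by Boltzmann’s entropy formula (again standard textbook material rather than England’s result), which relates entropy to the number W of microscopic configurations compatible with a macroscopic state:

S = k_B \ln W

Configurations with dispersed energy vastly outnumber configurations with concentrated energy, so W, and hence S, grows as energy spreads out.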

Hence a cup of coffee and the room it sits in become the same temperature.

 

Irreversible Process

As long as the cup and the room are left alone, this process is irreversible.  The coffee never spontaneously heats up again, because the odds are overwhelmingly stacked against so much of the room’s energy randomly concentrating in its atoms.

By the same token, if you should drop the cup on a hard surface, it will break.  And no amount of super-strong glue will ever see this cup spontaneously assemble itself back together…

 

Life does not violate the Second Law of Thermodynamics

A drawing explaining a plant's photosynthesis cycle, in which sunlight is used to convert absorbed carbon dioxide and water into a source of energy and oxygen.
A plant’s life cycle is driven by photosynthesis.

Although entropy must increase over time in an isolated or “closed” system, an “open” system can keep its entropy low – that is, divide energy unevenly among its atoms – by greatly increasing the entropy of its surroundings.  In his influential 1944 monograph “What Is Life?”, Quantum physicist Erwin Schrödinger argued that this is precisely what living things must do.

For example, a plant absorbs extremely energetic sunlight, uses it to build sugars, and ejects infra-red light, a much less concentrated form of energy.  The overall entropy of the Universe increases during photosynthesis as the sunlight dissipates, even as the plant prevents itself from decaying by maintaining an orderly internal structure.
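In rough bookkeeping terms (a standard textbook argument rather than England’s formula), the second law only demands that the total change be non-negative:

\Delta S_{\text{universe}} = \Delta S_{\text{plant}} + \frac{Q_{\text{dissipated}}}{T_{\text{surroundings}}} \geq 0

so the plant can keep its own entropy low, or even lower it, provided the heat it dumps into its surroundings raises their entropy by at least as much.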

Yet until recently, physicists were unable to use thermodynamics to explain why life should arise in the first place.  In Schrödinger’s day, they could solve the equations of thermodynamics only for closed systems in equilibrium.

In the 1960s, Belgian physicist Ilya Prigogine made progress on predicting the behaviour of open systems weakly driven by external energy sources (for which he won the 1977 Nobel Prize in chemistry).

 

Self-organization in Non-Equilibrium Systems – From Dissipative Structures to Order Through Fluctuations

 

A photograph of Belgian physicist Ilya Prigogine and a quote by him saying: "The irreversibility of time is the mechanism that brings order out of chaos".

The emphasis of Classical Physics has consistently been on stability.  Yet today’s thinking has evolved to the realisation that stability characterises only limited aspects of the natural world.

Evolutionary processes lead to vast diversification and increasing complexity: the behaviour of a macroscopic system with many interacting sub-units can differ substantially from a random superposition of the evolutions of its independent sub-units.

This monograph explores the self–organisation phenomena arising in such systems.  While relatively new, this field of investigation already encompasses a wide range of problems from chemistry to biology and population dynamics.

With emphasis on non-linear interactions and non-equilibrium constraints, the authors explain how departures from incoherent behaviour are induced.  Bifurcation and probability theories are used to explain evolution and pattern formation, and to analyse examples of self-organisation such as oscillating reactions, enzymatic reactions, cellular regulation, morphogenesis, pre-biotic evolution, population dynamics, and socio-biology.

It stresses the universality of concepts and mechanisms underlying self-organisation, as opposed to the diversity of the fields to which they apply. 

 

But the behaviour of systems that are far from equilibrium, which are connected to the outside environment and strongly driven by external sources of energy, could not be predicted.

 

Jarzynski and Crooks

In the late 1990s, it all changed, due primarily to the work of American scientists Chris Jarzynski and Gavin Crooks, who showed that the entropy produced by a thermodynamic process, such as the cooling of a cup of coffee, corresponds to a simple ratio.

The Jarzynski equality is an equation in statistical mechanics that relates the free-energy difference between two states to the irreversible work performed along an ensemble of trajectories joining those states.  The ratio in question is, basically, the probability that the atoms will undergo a given process divided by the probability that they will undergo the reverse process (i.e. spontaneously interact in such a way that the coffee warms up).

\frac{\text{probability that the atoms will undergo that process}}{\text{probability that the atoms will undergo the reverse process}}

As entropy production increases, so does this ratio.  A system’s behaviour becomes more and more “irreversible.”
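Schematically, and in the spirit of Crooks’ trajectory-level result rather than as a quotation from any particular paper, the correspondence can be written as

\frac{P(\text{forward process})}{P(\text{reverse process})} = e^{\Delta S_{\text{tot}}/k_B}

A process that generates a great deal of entropy is therefore exponentially more probable than its time-reverse, which is exactly what “irreversible” means here.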

 

Entropy production fluctuation theorem and the nonequilibrium work relation for free energy differences

 

A photograph of American scientist Gavin Crooks.
Gavin Crooks Source: Wikipedia

The simple, yet rigorous, formula could in principle be applied to any thermodynamic process, no matter how fast or far from equilibrium.  As a result, our understanding of far-from-equilibrium statistical mechanics has greatly improved.

There are only a very few known relations in statistical dynamics that are valid for systems driven arbitrarily far-from-equilibrium.

One of these is the fluctuation theorem, which places conditions on the entropy production probability distribution of non-equilibrium systems.  Another recently discovered far-from-equilibrium expression relates non-equilibrium measurements of the work done on a system to equilibrium free energy differences.

In his 1999 paper, Crooks derives a generalised version of the fluctuation theorem for stochastic, microscopically reversible dynamics.  Invoking this generalised theorem provides a succinct proof of the non-equilibrium work relation.
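For reference, and in standard notation rather than anything specific to this article, the two relations are usually written as follows.  The Crooks fluctuation theorem compares the probability of performing work W on the system along the forward protocol with the probability of recovering that work along the reversed protocol,

\frac{P_F(+W)}{P_R(-W)} = e^{\beta (W - \Delta F)}

and averaging e^{-\beta W} over the forward process turns it into the Jarzynski equality,

\left\langle e^{-\beta W} \right\rangle = e^{-\beta \Delta F}

where \beta = 1/k_B T and \Delta F is the equilibrium free-energy difference between the initial and final states.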

 

Using Jarzynski and Crooks’ formulation, Jeremy England derived a generalisation of the second law of thermodynamics that holds for systems of particles with certain characteristics.

The systems are strongly driven by an external energy source such as an electromagnetic wave, and they can dump heat into a surrounding bath.

This class of systems includes all living things.

A photograph of American physicist Jeremy England.
Jeremy England

England determined how such systems tend to evolve over time as they increase their irreversibility.  According to the formula, the more likely evolutionary outcomes are the ones that absorbed and dissipated more energy from the environment’s external drives on the way to getting there.
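The generalisation itself can be quoted in compact form.  Schematically, and loosely following the notation of England’s paper in the Journal of Chemical Physics (which should be consulted for the precise definitions), a transition between two coarse-grained states I and II of a system coupled to a bath at inverse temperature \beta obeys

\beta \left\langle \Delta Q \right\rangle_{I \rightarrow II} + \ln \left[ \frac{\pi(II \rightarrow I)}{\pi(I \rightarrow II)} \right] + \Delta S_{\text{int}} \geq 0

where \left\langle \Delta Q \right\rangle is the average heat released into the bath along the forward transition, \pi denotes the transition probabilities, and \Delta S_{\text{int}} is the change in the system’s internal entropy.  The more statistically irreversible the transition, the more heat must, on average, be dumped into the surroundings.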

The finding makes intuitive sense.

Particles tend to dissipate more energy when they resonate with a driving force, or move in the direction it is pushing them, and they are more likely to move in that direction than any other at any given moment.

Potentially, this means clumps of atoms surrounded by a bath at some temperature, like the atmosphere or the ocean, should tend over time to arrange themselves to resonate better and better with the sources of mechanical, electromagnetic or chemical work in their environments.

Atoms might arrange themselves to resonate better with their environments.

 

Statistical Physics of Self-Replication

A diagram representing self-replicating sphere clusters.
Self-Replicating Sphere Clusters – According to new research at Harvard, coating the surfaces of microspheres can cause them to assemble spontaneously into a chosen structure, such as a polytetrahedron (red), which then triggers nearby spheres into forming an identical structure.

Self-replication is a capacity common to every species of living thing, and simple physical intuition dictates that such a process must invariably be fuelled by the production of entropy.

Self-replication (or reproduction, in biological terms), the process that drives the evolution of life on Earth, is one such mechanism by which a system might dissipate an increasing amount of energy over time.  A great way of dissipating more energy is to make more copies of yourself.

In a paper in the Journal of Chemical Physics, England reported the theoretical minimum amount of dissipation that can occur during the self-replication of RNA molecules and bacterial cells, and showed that it is very close to the actual amounts these systems dissipate when replicating.
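Schematically, and using that paper’s notation loosely (g for the replicator’s growth rate and \delta for its spontaneous decay or reversion rate), the bound states that the total entropy produced per replication event must satisfy

\Delta S_{\text{tot}} = \beta \left\langle \Delta q \right\rangle + \Delta s_{\text{int}} \geq \ln \left( \frac{g}{\delta} \right)

In words, the heat released plus the change in internal entropy must at least cover the logarithm of how much more often copies are made than destroyed, and this is what sets the theoretical minimum dissipation mentioned above.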

He showed that RNA, the nucleic acid that many scientists believe served as the precursor to DNA-based life, is a particularly cheap building material.  According to him, once RNA arose, its “Darwinian takeover” was perhaps unsurprising.

The chemistry of the primordial soup, random mutations, geography, catastrophic events and countless other factors have contributed to the fine details of Earth’s diverse flora and fauna.  But according to England’s theory, the underlying principle driving the whole process is dissipation-driven adaptation of matter.

 

Living or Inanimate Systems

Two pictures from a computer simulation showing dissipation-driven adaptation of matter.
A computer simulation by Jeremy England and colleagues shows a system of particles confined inside a viscous fluid in which the turquoise particles are driven by an oscillating force. Over time, the force triggers the formation of more bonds among the particles.

This principle might apply to inanimate matter, as well.

Scientists have already observed self-replication in non-living systems.  According to new research led by Philip Marcus of the University of California, Berkeley, and reported in Physical Review Letters in August, vortices in turbulent fluids spontaneously replicate themselves by drawing energy from shear in the surrounding fluid.  And in a paper published in Proceedings of the National Academy of Sciences, Professor Michael Brenner and his Harvard collaborators in Applied Mathematics and Physics presented theoretical models and simulations of micro-structures that self-replicate.  These clusters of specially coated micro-spheres dissipate energy by roping nearby spheres into forming identical clusters.

Besides self-replication, greater structural organisation is another means by which strongly driven systems ramp up their ability to dissipate energy.  A plant is much better at capturing and routing solar energy through itself than an unstructured heap of carbon atoms.

Under certain conditions, matter will spontaneously self-organise.

 

This tendency could account for the internal order of living things and of many inanimate structures as well.

Snowflakes, sand dunes, turbulent vortices all have in common that they are strikingly patterned structures that emerge in many-particle systems driven by some dissipative process.

Condensation, wind, viscous drag are the relevant processes in these particular cases.

 

Energy Dissipation: Entropy

An illustration showing the entropy of snowflake, from an ordered icy structure melting into disordered water molecules.
The entropy of a snowflake.

If England’s new theory of the statistical physics of self-replication is correct, the very physics it identifies as being responsible for the origin of living things could also explain the formation of many other patterned structures in Nature.

Snowflakes, sand dunes and self-replicating vortices in the proto-planetary disk may all be examples of dissipation-driven adaptation.

England’s bold idea will likely face close scrutiny in the coming years.  He is currently running computer simulations to test his theory that systems of particles adapt their structures to become better at dissipating energy.  The next step will be to run experiments on living systems.
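To give a feel for what such a simulation involves, here is a deliberately minimal toy sketch in Python (an illustration of the basic ingredients only, not England’s actual simulation code): overdamped Brownian particles in a viscous bath, pushed by an oscillating external force, with the work absorbed from the drive tallied along the way and close pairs counted as crude “bonds”.  All parameter names and values are arbitrary choices made for this illustration.

# Toy sketch of a driven, dissipative many-particle system (illustration only).
# Overdamped Langevin dynamics in a 2D periodic box: gamma * dx = F_drive * dt + noise.
import numpy as np

rng = np.random.default_rng(0)

n, dim = 50, 2            # number of particles and spatial dimensions
gamma, kT = 1.0, 1.0      # drag coefficient and bath temperature (arbitrary units)
dt, steps = 1e-3, 20000   # time step and number of integration steps
amp, omega = 5.0, 2.0     # amplitude and angular frequency of the oscillating drive
bond_radius = 0.1         # particles closer than this count as a "bond"

pos = rng.uniform(0.0, 1.0, size=(n, dim))   # random initial clump in a unit box
work_in = 0.0                                # running tally of work done by the drive

for step in range(steps):
    t = step * dt
    drive = np.zeros((n, dim))
    drive[:, 0] = amp * np.sin(omega * t)                      # oscillating force along x
    noise = np.sqrt(2.0 * kT * dt / gamma) * rng.standard_normal((n, dim))
    disp = drive * dt / gamma + noise                          # overdamped displacement
    pos = (pos + disp) % 1.0                                   # periodic boundaries
    work_in += np.sum(drive * disp)                            # F·dx: work absorbed from the drive,
                                                               # dissipated into the bath on average

# Count close pairs ("bonds") using minimum-image distances in the unit box.
delta = pos[:, None, :] - pos[None, :, :]
delta -= np.round(delta)
dist = np.sqrt((delta ** 2).sum(axis=-1))
bonds = int(np.triu(dist < bond_radius, k=1).sum())

print(f"work absorbed from the drive ~ {work_in:.1f} kT, close pairs ('bonds'): {bonds}")

Because this skeleton has no interparticle forces or chemistry, the “bonds” are just random close pairs; the studies described above add interactions so that the system’s structure can genuinely reorganise, and the question of interest is whether that structure drifts toward configurations that absorb and dissipate more work from the drive.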

Some scientists believe that England’s theory could be tested by comparing cells with different mutations and looking for a correlation between the amount of energy the cells dissipate and their replication rates.

Any entity that makes copies of itself is built out of some particular set of materials with a contingent set of properties and modes of interaction or inter-combination.  The liberation of carbon from various metabolites increases entropy by shuffling around vibrational and rotational degrees of freedom.

The cost for aerobic bacterial respiration is relatively small, and substantially outstripped by the sheer irreversibility of the self-replication process as it churns out copies that cannot easily disintegrate into their constituent parts.

During division, the bacterium consumes a number of oxygen molecules roughly equal to the number of amino acids in the new cell it creates.

The resulting population dynamics must obey the same general relationship entwining heat, organisation, and durability.

According to Prentiss, if energy dissipation and replication rate are indeed correlated, it suggests this may be the correct organizing principle.

But it remains to be seen whether the theory correctly predicts which self-replication and self-assembly processes can occur – a fundamental question in science.

 

Cell Division and Macroscopic Irreversibility

A diagram illustrating the chemical process of DNA methylation.
DNA Methylation – Our DNA is wrapped around histones (blue spheres) and organized into structures called nucleosomes.  Chemically-added tags such as methylation (red circles) can affect how open the DNA is to the cell’s machinery and, consequently, how strongly a gene is expressed.  Source: cnx.org

Having an overarching principle of life and evolution would give researchers a broader perspective on the emergence of structure and function in living things.

Natural selection does not explain certain characteristics.  These include heritable changes to gene expression (known as methylation), increases in complexity in the absence of natural selection, and certain molecular changes.

An animation showing microscope imaging of living epithelial cells undergoing mitosis.
Living epithelial cells undergoing mitosis – the process by which most cells divide to produce two copies of themselves.

The process of cellular division, even in a creature as ancient and streamlined as a bacterium, is so bewilderingly complex that it may come as some surprise that physics can make any binding pronouncements about how fast it all can happen.  The reason this becomes possible is that non-equilibrium processes in constant temperature baths obey general laws that relate forward and reverse transition probabilities to heat production.

Previously, such laws had been applied successfully in understanding the thermodynamics of copying “informational” molecules such as nucleic acids.  In those cases, however, the information content of the system’s molecular structure could more easily be taken for granted, in light of the clear role played by DNA in the production of RNA and protein.

According to England, “it is remarkable that in a single environment, the organism can convert chemical energy into a new copy of itself so efficiently that if it were to produce even a quarter as much heat it would be pushing the limits of what is thermodynamically possible!”

A cartoon illustrating the idea of algorithms and life. Two DNA strands are playing video games, encouraged by two others. The captions read: "Erg, this level is really hard." "Try the algorithm." "Yeah, do it!" "It worked last time."
DNA does algorithm. Applying game theory to the behaviour of genes provides a new view of natural selection. Source: QuantaMagazine.org

If England’s approach stands up to more testing, it could further liberate biologists from seeking a Darwinian explanation for every adaptation and allow them to think more generally in terms of dissipation-driven organization.

The reason why an organism displays a characteristic X rather than Y may not be, after all, because X is fitter than Y, but because physical constraints make it easier for X, rather than Y, to evolve.

Whether or not England’s ideas turn out to be exactly right, thinking more broadly about a problem is where many scientific breakthroughs are made.

This insight might spur interest in the physics of natural selection in non-equilibrium systems.  And who knows where his new theory of life might drive the rest of science…?

After all, life has no Ctrl+Z