Cellular Thermodynamics: The Molecular and Macroscopic Views


On the macroscopic scale, biology obeys thermodynamics, and this article introduces the laws of thermodynamics and the macroscopic variables they involve, including entropy. The molecular description is often more directly useful to biologists, so the article also explains the Boltzmann distribution, which describes how the proportion of molecules having a particular energy depends on energy and temperature, and which quantifies the compromise between minimising energy and maximising entropy. This simple law is then used to explain and quantify several aspects of the function and structure of cells, including Arrhenius' law for reaction rates, the Nernst equation for electrochemical equilibria, and hydraulic and osmotic equilibria between solutions with different compositions. It also explains the equilibrium between different phases (solid, liquid and vapour), the aggregation of molecules in self‐assembly, aspects of the stability and mechanical homeostasis of membranes, and entropic forces in biomolecules.

Key Concepts

  • Cells and components that are physically close together are often approximately in thermal and chemical equilibrium, which allows some powerful observations about their properties.
  • Cells and tissues sometimes approximate steady state: they input and output energy and matter at approximately equal rates and vary little over time.
  • Entropy measures the ‘uselessness’ of energy: the biosphere survives by converting energy with low entropy (light from the sun) into energy with high entropy (waste heat lost to the atmosphere and to space).
  • Entropy also quantifies disorder – but only at a molecular level.
  • The Boltzmann energy distribution leads directly to a number of important laws and effects relevant to cell biology:
    • Arrhenius' law of reaction rates,
    • ion equilibria,
    • osmotic pressure,
    • phase equilibria, and
    • molecular oligomerisation and aggregation.

Keywords: entropy; osmosis; Arrhenius' law; Nernst equation; molecular aggregation; oligomerisation; Gibbs free energy; chemical potential; membrane homeostasis; entropic forces

Figure 1. Thermal physics schematics of the Earth, a photosynthetic plant cell and an animal cell. Their macroscopically ordered structure – and anything else they do or produce – is ultimately maintained or ‘paid for’ by a flux of energy in from the sun (at ∼6000 K) and out to the night sky (at ∼3 K). The equal size of the arrows represents the near equality of average energy flux in and out. The entropy flux is not equal: the rate of entropy export is in all cases much greater than the rate of entropy intake. Apart from exceptional ecosystems on the ocean floor that obtain their energy from geochemistry rather than from sunlight, all life on Earth depends on photosynthesis and, thus, on this energy and entropy flux. The Earth absorbs solar energy at a rate of about 5 × 10^16 W, so the rate of entropy input is about 5 × 10^16 W/6000 K ≅ 8 × 10^12 W K^−1. The Earth radiates heat at nearly the same rate, but at its much lower effective radiating temperature of about 255 K, so its rate of entropy output is 5 × 10^16 W/255 K ≅ 2 × 10^14 W K^−1. Thus, the biosphere creates entropy at a rate of nearly 2 × 10^14 W K^−1. Living things contribute only a fraction of this entropy, but that is still a huge amount. Despite the disingenuous claims of some anti‐evolutionists, biological processes are in no danger of violating the second law of thermodynamics.
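
The arithmetic in this caption is easy to check. The short sketch below recomputes the entropy fluxes from the values quoted above (absorbed power, source temperature and the Earth's effective radiating temperature); no other assumptions are made.

```python
# Recompute the entropy bookkeeping of Figure 1 from the caption's values.
P_absorbed = 5e16   # W, solar power absorbed by the Earth (caption value)
T_sun = 6000.0      # K, effective temperature of the incoming sunlight
T_earth = 255.0     # K, Earth's effective radiating temperature

S_in = P_absorbed / T_sun      # rate of entropy intake, about 8e12 W/K
S_out = P_absorbed / T_earth   # rate of entropy export, about 2e14 W/K

print(f"entropy in:  {S_in:.1e} W/K")
print(f"entropy out: {S_out:.1e} W/K")
print(f"net entropy production: {S_out - S_in:.1e} W/K")
```
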
Figure 2. At equilibrium, molecules and their energies are distributed according to the Boltzmann distribution, sketched here. In states, phases or regions having different energies E, the concentrations or numbers of molecules are proportional to exp(−E/kT), where E is the energy per molecule, k = 1.38 × 10^−23 J K^−1 is the Boltzmann constant and T is the absolute temperature.
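
To make the exponential dependence concrete, the sketch below evaluates the relative populations exp(−E/kT) for a few energies expressed as multiples of kT. The choice of 310 K (roughly body temperature) is an illustrative assumption.

```python
import math

k = 1.38e-23   # J/K, Boltzmann constant (value quoted in the caption)
T = 310.0      # K, roughly body temperature (illustrative assumption)

def relative_population(E):
    """Number or concentration relative to a state of zero energy: exp(-E/kT)."""
    return math.exp(-E / (k * T))

for n in (0.5, 1, 2, 5, 10):      # energies expressed as multiples of kT
    E = n * k * T
    print(f"E = {n:>4} kT  ->  n(E)/n(0) = {relative_population(E):.3g}")
```
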
Figure 3. The variation with altitude of the concentration of gas in a gravitational field gives a simple derivation of the Boltzmann distribution, if we assume constant temperature. Concentration is high at ground level (low gravitational potential energy) and falls as the altitude h increases. The gravitational potential energy is Ug = Mgh per mole, where g is gravitational acceleration and M is the molar mass. Consider the volume element sketched, with thickness dh and area A: the pressure difference dP acting on area A supports the weight of the element (dP < 0). So A dP = −ρgA dh, where ρ is the density of the gas. The equation of state PV = nRT gives ρ = nM/V = MP/RT. Combining these equations gives dP/dh = −MgP/RT = −P/λ, where λ = RT/Mg ≈ 8 km. Solving this equation gives P = P0 exp(−Mgh/RT), and substitution in the equation of state gives C = C0 exp(−Mgh/RT) = C0 exp(−Ug/RT), which is the Boltzmann distribution. Thus, at low altitude, the potential energy is low, the concentration is high and the entropy per mole is low, and conversely for high altitudes.
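
A minimal numerical sketch of this result, assuming dry air (M ≈ 0.029 kg mol^−1) and a uniform temperature of 288 K, reproduces the ∼8 km scale height and the exponential fall of concentration with altitude:

```python
import math

R = 8.314    # J mol^-1 K^-1, gas constant
g = 9.81     # m s^-2, gravitational acceleration
M = 0.029    # kg mol^-1, molar mass of dry air (assumed)
T = 288.0    # K, assumed uniform temperature

lam = R * T / (M * g)                    # scale height, ~8.4 km (caption: ~8 km)
print(f"scale height lambda = {lam / 1000:.1f} km")

for h in (0, 2000, 5000, 8000, 15000):   # altitude in metres
    print(f"h = {h / 1000:5.0f} km   C/C0 = {math.exp(-h / lam):.2f}")
```
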
Figure 4. This highly simplified diagram of a polymer, fixed at one end, illustrates how the entropy term in the Gibbs free energy contributes to a force. Force times change in length gives the work done in stretching the polymer, so we examine states with different lengths. In the all‐trans configuration, the polymer has a length L0, but there is only one configuration corresponding to that length, so the entropy associated with the internal bending is k ln 1 = 0. At a shorter length L, there are N possible configurations, of which only a few are shown. The entropy associated with this particular length is here k ln N > 0. Suppose that the extra potential energy associated with the cis bonds is ΔU, and neglect changes in the molecule's partial volume. The change in G is ΔG = ΔU − TS_bend. We consider the case where TS_bend > ΔU, so that the shorter state is stable (has a lower G) with respect to the longer. In molecular terms, we would say that the Boltzmann distribution indicates that the ratio of the number of molecules with length L to that with length L0 is N exp(−ΔU/kT), which in this case is >1. Note that, when we increase the temperature, the average length decreases – which also happens when one heats a rubber band, and many other polymers. We could also calculate the (average) force exerted by this molecule, ∂G/∂L, and, from a collection of such molecules, we can find the elastic properties of this (idealised) polymer. Finally, note that the entropic term is more important as T increases, because T multiplies S. (In Boltzmann terms, more molecules are in the higher energy state.) This increasing effect with temperature is important for entropic forces, including the hydrophobic interaction, discussed later in the text. See also: The Hydrophobic Effect.
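
The two-state bookkeeping in this caption is easy to evaluate numerically. In the sketch below, the number of bent configurations N and the energy penalty ΔU are illustrative values, not taken from the text; the point is only that the Boltzmann ratio N exp(−ΔU/kT) grows with temperature, i.e. the average length falls on heating, as the caption notes for rubber.

```python
import math

k = 1.38e-23    # J/K, Boltzmann constant
dU = 4e-21      # J, assumed extra energy of the shorter (bent) state
N = 50          # assumed number of configurations with the shorter length L

def ratio_short_to_long(T):
    """Boltzmann ratio n(L)/n(L0) = N * exp(-dU/kT) for the two-state model."""
    return N * math.exp(-dU / (k * T))

for T in (280.0, 310.0, 340.0):
    print(f"T = {T:.0f} K   n(L)/n(L0) = {ratio_short_to_long(T):.1f}")
# The ratio rises with T: heating favours the shorter, higher-entropy state.
```
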
Figure 5. The Boltzmann distribution explains the equilibrium distribution across a membrane of permeating positive (+ve) and negative (−ve) ions. The electric potential energy per unit charge is called the electric potential or voltage (Φ). For a permeating cation, the left‐hand (low Φ) side has the lower energy, whereas for a permeating anion the right‐hand side does. The highest entropy state would have equal concentrations of the permeating ions everywhere. Note that this distribution applies only to freely permeating ions that are not actively pumped. In practice, the total number of positive charges inside a cell differs by only a very tiny fraction from the total number of negative charges. In terms of this diagram, many of the extra intracellular anions required to approach electroneutrality are found on non‐permeating species, including macromolecules, while the extracellular solution will also contain many cations that are expelled from the cell by active (energy‐using) processes.
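
Writing the Boltzmann condition for a permeating ion gives the Nernst relation mentioned in the abstract: at equilibrium, c1/c2 = exp(−ze(Φ1 − Φ2)/kT), or equivalently the potential difference that balances a given concentration ratio is Φ1 − Φ2 = (kT/ze) ln(c2/c1). The sketch below evaluates this for an illustrative monovalent cation that is ten times more concentrated outside than inside; the temperature of 310 K is an assumption.

```python
import math

k = 1.38e-23    # J/K, Boltzmann constant
e = 1.602e-19   # C, elementary charge
T = 310.0       # K, assumed temperature

def equilibrium_potential(z, c_in, c_out):
    """Membrane potential (inside minus outside, volts) at which an ion of
    valence z with the given concentrations obeys the Boltzmann distribution."""
    return (k * T / (z * e)) * math.log(c_out / c_in)

# Illustrative monovalent cation, ten times more concentrated outside than inside:
print(f"{1000 * equilibrium_potential(+1, c_in=1.0, c_out=10.0):.0f} mV")  # about +62 mV
```
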
Figure 6. A negatively charged surface in an ionic solution gives a non‐uniform ion distribution over a scale of nanometres (a). Positive ions are attracted towards the interface; negative ions are repelled. The layer of excess positive ions (called a double layer) counteracts the electrical effect of the charged surface at sufficiently large distance. (b) and (c) show how the electrical potential Φ and the ionic concentrations c+ and c− vary near the surface; both concentrations approach their bulk value c0 at large distance x from the charged surface. The electric charge of ions has another effect that contributes to their distribution. The self‐energy or Born energy of an ion is the energy associated with the field of the ion itself. This depends on the medium and is lower in polar media such as water than in non‐polar media such as the hydrophobic regions of membranes or macromolecules. Thus, ions are found in only negligible concentration in the hydrocarbon interior of membranes, where their Born energy is very high. This energy is also thought to affect the distribution of ions inside aqueous pores or near protein molecules: if the ion is sufficiently close (∼nm) to a non‐polar region of membrane or macromolecule, it still has a high Born energy because its field penetrates into the non‐polar regions. The Born energy and the Boltzmann distribution mean that the concentration of ions of either sign is lower than would otherwise be expected, not only in non‐polar regions but also in the aqueous solution very near such regions. Again, the equilibrium distribution is a compromise between minimising the Born energy (ions keep away from non‐polar regions) and maximising entropy (ions distribute uniformly everywhere).
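
The compromise described here between electrostatic energy and entropy can be illustrated by applying the Boltzmann distribution to a given potential profile: for monovalent ions, c±(x) = c0 exp(∓eΦ(x)/kT). In the sketch below, the surface potential, its decay length and the bulk concentration are all illustrative assumptions, chosen only to show counterion enrichment and co-ion depletion near the surface.

```python
import math

k, e = 1.38e-23, 1.602e-19   # J/K and C
T = 298.0                    # K, assumed temperature
c0 = 100.0                   # mol m^-3, assumed bulk concentration
phi0 = -0.050                # V, assumed potential at the negative surface
decay = 1e-9                 # m, assumed decay length of the potential

for x in (0.0, 0.5e-9, 1e-9, 3e-9):              # distance from the surface
    phi = phi0 * math.exp(-x / decay)            # assumed exponential profile
    c_plus = c0 * math.exp(-e * phi / (k * T))   # cations: attracted (phi < 0)
    c_minus = c0 * math.exp(+e * phi / (k * T))  # anions: repelled
    print(f"x = {x * 1e9:3.1f} nm   c+ = {c_plus:6.1f}   c- = {c_minus:6.1f} mol m^-3")
```
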
Figure 7. The membrane shown in (a) is permeable to water but not to solutes. Water flows towards the more concentrated solution (which has a lower water concentration). This flow stops and hydraulic equilibrium is achieved when the pressure P in the concentrated solution is sufficiently high. The system in (b) has a higher energy than (a), and it also has higher entropy. Entropy would be maximised if the membrane burst and the solutions mixed freely. Using the Boltzmann distribution with PV as the energy difference, we obtain the condition for osmotic equilibrium – the value of the hydrostatic pressure difference required to stop further flow of water into the solution side. In the case of a solution, the number fraction of water molecules, rather than their concentration, must be used in the Boltzmann relation. Using the approximation that solute volume and concentration are small, one readily obtains an approximate expression for the equilibrium pressure difference, P_osm ≅ kTc_s = RTC_s, where the solute concentration is c_s in molecules m^−3 or C_s in kmol m^−3, respectively. Note that the water concentration is usually much higher (∼50 kmol m^−3) than solute concentrations, and so the proportional difference in water concentration is small. However, rates of diffusion and permeation depend on the absolute difference. A solution of 1 kmol m^−3 or 1 mol L^−1 of non‐dissociating solute gives an osmotic pressure of approximately 2 MPa, or 20 atm, the exact value depending on temperature and on some corrections due to solute–solvent interactions.
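
As a numerical check of P_osm ≅ RTC_s: at an assumed 298 K, an ideal, non-dissociating solute at 1 mol L^−1 gives about 2.5 MPa (roughly 24 atm), of the same order as the caption's rounded figure before solute–solvent corrections are applied. A minimal sketch:

```python
R = 8.314      # J mol^-1 K^-1, gas constant
T = 298.0      # K, assumed temperature
C_s = 1000.0   # mol m^-3 (= 1 mol/L), non-dissociating solute

P_osm = R * T * C_s    # ideal (van 't Hoff-type) estimate
print(f"P_osm ~ {P_osm / 1e6:.1f} MPa ~ {P_osm / 101325:.0f} atm")
```
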
Figure 8. The surface free energy of water. In bulk solution, water molecules are transiently hydrogen bonded to their neighbours. A water molecule at an air–water surface or a water–hydrocarbon interface has fewer neighbours with which to form hydrogen bonds, so its energy is higher. Because molecules will tend to rotate to form bonds with their neighbours, the surface is more ordered and therefore has a lower entropy per molecule. Together, these effects give the water surface a relatively large free energy per unit area, which gives rise to the hydrophobic effect. The work done per unit area of new surface created is numerically equal to the surface tension γ, which is the force per unit length acting in the surface. This result is simply derived: if one expands the surface by drawing an edge of length L through a perpendicular distance dx, the force acting along the length L is γL, so the work done is γL dx = γ dA, where dA is the increase in area.
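
To give a sense of the scale of this free energy, the sketch below computes the work γΔA needed to create an illustrative ∼1 nm² patch of new air–water surface (taking γ ≈ 72 mN m^−1 near room temperature) and compares it with kT; the patch size is an arbitrary assumption.

```python
gamma = 0.072    # J m^-2 (= N/m), air-water surface tension near room temperature
k = 1.38e-23     # J/K, Boltzmann constant
T = 298.0        # K
area = 1e-18     # m^2, an illustrative patch of about 1 nm^2

work = gamma * area          # work = gamma * dA, as derived in the caption
print(f"work = {work:.1e} J  =  {work / (k * T):.0f} kT")
```
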
Figure 9. The equilibria among a lipid monomer (a), a vesicle containing N molecules (b) and a macroscopic lipid bilayer membrane (c) show the effect of molecular aggregation and mechanical stress.



Further Reading

Babloyantz A (1986) Molecules, Dynamics, and Life: An Introduction to Self‐Organization of Matter. New York: Wiley‐Interscience.

Coster HGL (1981) Thermodynamics of Life Processes. Sydney, Australia: New South Wales University Press.

Glansdorff P and Prigogine I (1971) Thermodynamic Theory of Structure, Stability and Fluctuations. New York: Wiley‐Interscience.

Nobel PS (1974) Introduction to Biophysical Plant Physiology. San Francisco, CA: Freeman.

Prigogine I and Nicolis G (1977) Self‐organization in Nonequilibrium Systems: from Dissipative Structures to Order through Fluctuations. New York: Wiley.

How to Cite

Wolfe, Joe (Feb 2015) Cellular Thermodynamics: The Molecular and Macroscopic Views. In: eLS. John Wiley & Sons Ltd, Chichester. http://www.els.net [doi: 10.1002/9780470015902.a0001363.pub2]