Figure 1. Thermal physics schematics of the Earth, a photosynthetic plant cell and an animal cell. Their macroscopically ordered structure – and
anything else they do or produce – is ultimately maintained or ‘paid for’ by a flux of energy in from the sun (at ∼6000 K)
and out to the night sky (at ∼3 K). The equal size of the arrows represents the near equality of average energy flux in and
out. The entropy fluxes are not equal: the rate of entropy export is in all cases much greater than the rate of entropy intake. Apart
from exceptional ecosystems on the ocean floor that obtain their energy from geochemistry rather than from sunlight, all life
on Earth depends on photosynthesis and, thus, on this energy and entropy flux. The Earth absorbs solar energy at a rate of
about 5 × 10¹⁶ W, so the rate of entropy input is about 5 × 10¹⁶ W / 6000 K ≅ 8 × 10¹² W K⁻¹. The Earth radiates heat at nearly the same rate, but at an effective temperature of about 255 K, so its rate of entropy output is 5 × 10¹⁶ W / 255 K ≅ 2 × 10¹⁴ W K⁻¹. Thus, entropy is created on Earth at a rate of nearly 2 × 10¹⁴ W K⁻¹. Living things contribute only a fraction of this entropy, but that is still a huge amount. Despite the disingenuous claims of some anti‐evolutionists, biological processes are in no danger of violating the second law of thermodynamics.
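The caption's entropy bookkeeping can be checked directly; the short sketch below (Python, using the caption's round figures) reproduces the quoted fluxes.

```python
# Entropy fluxes for the Earth, using the round figures from the caption.
P_absorbed = 5e16   # W, rate at which the Earth absorbs solar energy
T_sun = 6000.0      # K, effective temperature of sunlight
T_earth = 255.0     # K, effective radiating temperature of the Earth

S_in = P_absorbed / T_sun      # entropy intake, W/K  (~8e12)
S_out = P_absorbed / T_earth   # entropy export, W/K  (~2e14)
S_created = S_out - S_in       # net rate of entropy creation, W/K

print(f"in:  {S_in:.1e} W/K")
print(f"out: {S_out:.1e} W/K")
print(f"net: {S_created:.1e} W/K")
```

The same energy flux carries far more entropy out at 255 K than it brought in at 6000 K, which is what pays for the order maintained on Earth.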
Figure 2. At equilibrium, molecules and their energies are distributed according to the Boltzmann distribution, sketched here. In states,
phases or regions having different energies E, the concentrations or numbers of molecules are proportional to exp(−E/kT), where E is the energy per molecule, k = 1.38 × 10⁻²³ J K⁻¹ is the Boltzmann constant and T is the absolute temperature.
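To make the distribution concrete, the sketch below compares Boltzmann factors for two assumed energy gaps: one of exactly kT, and one of a typical hydrogen-bond scale (∼20 kJ mol⁻¹, an illustrative value, not from the caption).

```python
import math

k = 1.38e-23    # Boltzmann constant, J/K
N_A = 6.022e23  # Avogadro's number, 1/mol
T = 300.0       # K, an assumed temperature

# Boltzmann factor exp(-E/kT) for an energy gap of exactly kT:
ratio_kT = math.exp(-(k * T) / (k * T))
print(ratio_kT)   # exp(-1), about 0.37

# ...and for an assumed hydrogen-bond-scale gap of 20 kJ/mol:
E_hb = 20e3 / N_A                   # J per molecule
ratio_hb = math.exp(-E_hb / (k * T))
print(ratio_hb)   # much smaller: such states are sparsely populated
```

States a few kT above the minimum are thus only modestly depopulated, while states many kT up are nearly empty.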
Figure 3. The variation with altitude of the concentration of gas in a gravitational field gives a simple derivation of the Boltzmann
distribution, if we assume constant temperature. Concentration is high at ground level (low gravitational potential energy)
and falls as the altitude h increases. The gravitational potential energy is Ug = Mgh per mole, where g is the gravitational acceleration and M is the molar mass. Consider the volume element sketched, with thickness dh and area A: the pressure difference dP acting on area A supports the weight of the element (dP < 0), so A dP = −ρgA dh, where ρ is the density of the gas. The equation of state PV = nRT gives ρ = nM/V = MP/RT. Combining these equations gives dP/dh = −MgP/RT = −P/λ, where λ = RT/Mg ≈ 8 km is the scale height of the (isothermal) atmosphere. Solving this equation gives P = P0 exp(−Mgh/RT), and substitution in the equation of state gives C = C0 exp(−Mgh/RT) = C0 exp(−Ug/RT), which is the Boltzmann distribution written per mole (RT) rather than per molecule (kT). Thus, at low altitude, the potential energy is low, the concentration is high and
the entropy per mole is low, and conversely for high altitudes.
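As a numerical check, the sketch below evaluates the scale height λ = RT/Mg for air and the barometric concentration ratio at an example altitude (the values of T, M, g and h are assumed, typical ones).

```python
import math

R = 8.314    # gas constant, J/(mol K)
T = 288.0    # K, assumed (isothermal) atmospheric temperature
M = 0.029    # kg/mol, mean molar mass of air
g = 9.81     # m/s^2, gravitational acceleration

lam = R * T / (M * g)   # scale height lambda = RT/Mg, in metres
print(lam)              # roughly 8.4e3 m, i.e. ~8 km as in the caption

# Boltzmann/barometric concentration ratio C/C0 at altitude h:
h = 8848.0              # m, an example altitude
print(math.exp(-h / lam))
```

At the altitude of the highest mountains the concentration has fallen to roughly a third of its sea-level value, which is why the air there feels so thin.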
Figure 4. This highly simplified diagram of a polymer, fixed at one end, illustrates how the entropy term in the Gibbs free energy contributes
to a force. Force times change in length gives the work done in stretching the polymer, so we examine states with different
lengths. In the all‐trans configuration, the polymer has a length L0, but there is only one configuration corresponding to that length, so the entropy associated with the internal bending is
k ln 1 = 0. At a shorter length L, there are N possible configurations, of which only a few are shown. The entropy associated with this particular length is here Sbend = k ln N > 0. Suppose that the extra potential energy associated with the cis bonds is ΔU, and neglect changes in the molecule's partial volume. The change in G on shortening is then ΔG = ΔU − TSbend. We consider the case TSbend > ΔU, in which the shorter state is stable (has the lower G) with respect to the longer one. In molecular terms, the Boltzmann distribution indicates that the ratio of the number of molecules with length L to the number with length L0 is N exp(−ΔU/kT), which in this case is >1. Note that, when we increase the temperature, the average length decreases – which also happens
when one heats a rubber band, and many other polymers. We could also calculate the (average) force exerted by this molecule,
∂G/∂L, and, from a collection of such molecules, we can find the elastic properties of this (idealised) polymer. Finally, note
that the entropic term is more important as T increases, because T multiplies S. (In Boltzmann terms, more molecules are in the higher energy state.) This increasing effect with temperature is important
for entropic forces, including the hydrophobic interaction, discussed later in the text. See also: The Hydrophobic Effect
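The two-state comparison in this caption can be made concrete with assumed values of N and ΔU; the sketch below evaluates the Boltzmann ratio N exp(−ΔU/kT) and shows that raising T favours the shorter, higher-entropy state.

```python
import math

k = 1.38e-23          # Boltzmann constant, J/K
N = 100               # assumed number of configurations at the shorter length L
dU = 2.0 * k * 300.0  # assumed extra energy of the bent (cis) state, J

def ratio_short_to_long(T):
    """Boltzmann ratio n(L)/n(L0) = N exp(-dU/kT)."""
    return N * math.exp(-dU / (k * T))

print(ratio_short_to_long(300.0))  # > 1: the shorter state dominates
# Heating favours the high-entropy (short) state still more, so the
# average length decreases with temperature, as for a rubber band:
print(ratio_short_to_long(400.0) > ratio_short_to_long(300.0))  # True
```

With these assumed numbers the entropic factor N easily outweighs the Boltzmann penalty exp(−ΔU/kT), and does so increasingly as T rises.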
Figure 5. The Boltzmann distribution explains the equilibrium distribution across a membrane of permeating positive (+ve) and negative
(−ve) ions. The electric potential energy per unit charge is called the electric potential or voltage (Φ). For a permeating
cation, the left side (low Φ) has the lower energy; for permeating anions, the right side does. The highest entropy state would
have equal concentrations of the permeating ions everywhere. Note that this distribution applies only to freely permeating
molecules that are not actively pumped. In practice, the total number of positive charges inside a cell differs by only a tiny fraction from the total number of negative charges. In terms of this diagram, many of the extra intracellular anions
required to approach electroneutrality are found on non‐permeating species, including macromolecules, while the extracellular
solution will also contain many cations that are expelled from the cell by active (energy‐using) processes.
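Applying the Boltzmann distribution to an ion of charge ze across the potential difference gives the familiar Nernst equilibrium condition; the sketch below evaluates it for assumed, textbook-typical K⁺ concentrations (not values from the caption).

```python
import math

R = 8.314      # gas constant, J/(mol K)
T = 310.0      # K, mammalian body temperature
F = 96485.0    # C/mol, Faraday constant

def nernst_potential(z, c_out, c_in):
    """Equilibrium (Nernst) potential for an ion of valence z, from the
    Boltzmann distribution: c_in / c_out = exp(-zF * (phi_in - phi_out) / RT)."""
    return (R * T) / (z * F) * math.log(c_out / c_in)

# Assumed, textbook-typical K+ concentrations (mmol/L) across a cell membrane:
E_K = nernst_potential(+1, c_out=5.0, c_in=140.0)
print(f"{E_K * 1000:.0f} mV")   # about -89 mV
```

At this potential the electrical pull on K⁺ into the cell exactly balances the entropic (concentration) drive outwards, so there is no net passive flux.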
Figure 6. A negatively charged surface in an ionic solution gives a non‐uniform ion distribution over a scale of nanometres (a). Positive
ions are attracted towards the interface, negative ions are repelled. The layer of excess positive ions (called a double layer)
counteracts the electrical effect of the charged surface at sufficiently large distance. (b) and (c) show how the electrical
potential Φ and the ionic concentrations c+ and c− vary near the surface; at large distance x from the charged surface, the concentrations approach their bulk value c0. The electric charge of ions has another effect that contributes to their distribution. The self‐energy
or Born energy of an ion is the energy associated with the field of the ion itself. This depends on the medium and is lower
in polar media such as water than in non‐polar media such as the hydrophobic regions of membranes or macromolecules. Thus,
ions are found in only negligible concentration in the hydrocarbon interior of membranes, where their Born energy is very
high. This energy is also thought to affect the distribution of ions inside aqueous pores or near protein molecules: if the
ion is sufficiently close (∼nm) to a non‐polar region of membrane or macromolecule, it still has a high Born energy because
its field penetrates into the non‐polar regions. The Born energy and the Boltzmann distribution mean that the concentration
of ions of either sign is lower than would otherwise be expected, not only in non‐polar regions but also in the aqueous solution
very near such regions. Again, the equilibrium distribution is a compromise between minimising the Born energy (ions keep
away from non‐polar regions) and maximising entropy (ions distribute uniformly everywhere).
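The ionic profiles in panels (b) and (c) follow the Boltzmann distribution in the local potential Φ(x); the sketch below evaluates c+ and c− for an assumed surface potential of −25 mV and an assumed bulk concentration.

```python
import math

k = 1.38e-23   # Boltzmann constant, J/K
T = 298.0      # K
e = 1.602e-19  # C, elementary charge
c0 = 100.0     # mol/m^3, assumed bulk concentration of a 1:1 electrolyte

def ion_concentrations(phi):
    """Boltzmann distribution of monovalent ions in a local potential phi (V):
    c+ is enhanced and c- depleted where phi < 0 (negative surface)."""
    c_plus = c0 * math.exp(-e * phi / (k * T))
    c_minus = c0 * math.exp(+e * phi / (k * T))
    return c_plus, c_minus

# Near the negatively charged surface (assumed phi = -25 mV, about -kT/e):
cp, cm = ion_concentrations(-0.025)
print(cp, cm)                    # c+ above bulk, c- below bulk
print(ion_concentrations(0.0))   # far away: both return to c0
```

A surface potential of only one thermal unit (kT/e ≈ 25 mV) already enhances the counterion concentration severalfold, which is why double layers form so readily.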
Figure 7. The membrane shown in (a) is permeable to water but not to solutes. Water flows towards the more concentrated solution (which
has a lower water concentration). This flow stops and hydraulic equilibrium is achieved when the pressure P in the concentrated solution is sufficiently high. The system in (b) has a higher energy than (a), and it also has higher
entropy. Entropy will be maximised if the membrane bursts and the solutions mix freely. Using the Boltzmann distribution with
PV as the energy difference and making the approximations that solute volume and concentrations are small, we obtain the condition
for osmotic equilibrium – the value of the hydrostatic pressure difference required to stop further flow of water into the
solution side. In the case of a solution, the number fraction of water molecules, rather than their concentration, must be used in the Boltzmann distribution. Using the approximation that solute volume and concentration are small, one readily obtains an approximate expression for the equilibrium pressure difference, Posm ≅ kTcs = RTCs, where the solute concentration is cs in molecules m⁻³ or Cs in kmol m⁻³, respectively. Note that the water concentration is usually much higher (∼50 kmol m⁻³) than solute concentrations, so the proportional difference in water concentration is small. However, rates of diffusion and permeation depend on the absolute difference. A solution of 1 kmol m⁻³ (i.e. 1 mol L⁻¹) of non‐dissociating solute gives an osmotic pressure of approximately 2.5 MPa, or about 24 atm, the exact value depending on temperature
and on some corrections due to solute–solvent interactions.
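The van 't Hoff estimate Posm ≅ RTCs quoted in the caption is easy to evaluate (room temperature assumed):

```python
R = 8.314     # gas constant, J/(mol K)
T = 298.0     # K, an assumed room temperature
C_s = 1000.0  # mol/m^3 (1 kmol/m^3, i.e. 1 mol/L) of non-dissociating solute

P_osm = R * T * C_s   # osmotic pressure, Pa (van 't Hoff, ideal solution)
print(f"{P_osm / 1e6:.2f} MPa")     # 2.48 MPa
print(f"{P_osm / 101325:.0f} atm")  # 24 atm
```

Solute–solvent interactions and non-ideality shift the real value somewhat, as the caption notes, but the order of magnitude is robust.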
Figure 8. The surface free energy of water. In bulk solution, water molecules are transiently hydrogen bonded to their neighbours. A
water molecule at an air–water surface or a hydrocarbon–air interface has fewer neighbours with which to form hydrogen bonds,
so its energy is higher. Because molecules will tend to rotate to form bonds with their neighbours, the surface is more ordered
and therefore has a lower entropy per molecule. Together, these effects give the water surface a relatively large free energy
per unit area, which gives rise to the hydrophobic effect. The work done per unit area of new surface created is numerically
equal to the surface tension γ, which is the force per unit length acting in the surface. This result is simply derived: if one expands the surface by drawing
a line of length L through a perpendicular distance dx, the force acting along the length L is γL, so the work done is γL dx = γ dA, where dA is the increase in area.
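The derivation in the caption amounts to a single multiplication; the sketch below evaluates γ dA for water (γ ≈ 72 mN m⁻¹ near room temperature) with assumed values of L and dx.

```python
gamma = 0.072   # N/m, surface tension of water near room temperature
L = 0.1         # m, assumed length of the line drawn through the surface
dx = 0.01       # m, assumed perpendicular displacement

dA = L * dx      # m^2, new surface area created
dW = gamma * dA  # J, work done = gamma * dA = (gamma * L) * dx
print(dW)        # of order 1e-4 J
```

Per unit area this is a large free energy by molecular standards (γ corresponds to several kT per surface molecule), which is the root of the hydrophobic effect.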
Figure 9. The equilibria among a lipid monomer (a), a vesicle containing N molecules (b) and a macroscopic lipid bilayer membrane (c) show the effects of molecular aggregation and mechanical stress.