Side Comments on Miller

I am quoting what seems to be the relevant portion of the first paper shared by @bjmiller. Maybe it will help in moving the discussion forward.

Living cells are dissipative, open, and far-from-equilibrium systems that lower the entropy utilizing an influx of energy and molecular material in a multi-compartment structure with specific functional characteristics. Entropy reduction was discussed early on by Schrödinger (1967) and it relies on both energy supply to create a metastable non-equilibrium state and electrical, pressure and chemical potential gradients across semi-permeable membranes. Electric potential differences also assist in the process. As an open system, a cell operates cyclically, exchanging material and heat with the environment. High-energy molecules are absorbed through pores in the membrane and their energy used to synthesize components of the cell and maintain ambient temperature. Heat is dissipated and waste products excreted so that excess entropy in the environment is balanced by structure- and information-production lowering the entropy inside the cell. This, of course, leads to a net entropy change in the cell fluctuating quasi-periodically close to the zero value. Cell death would manifest itself in the breakdown of structures and functions leading to a continuous entropy production as governed by the second law of thermodynamics. Overall, the entropy changes in the cell can be attributed to four distinct processes: (a) chemical reactions leading to the aggregation of molecules, (b) mass transport in and out of the cell leading to concentration gradients across the membrane, (c) heat generation due to metabolism of the cell, and (d) information stored in terms of genetic code in both nuclear and mitochondrial DNA. Morowitz (1955) estimated that approximately 2 × 10^11 bits of information are contained in the structure of Escherichia coli bacteria, the simplest and best documented organism, a number which agrees with calorimetric data (Gilbert, 1966). However, the estimated information capacity in the E. coli genome is only 10^7 bits (Johnson, 1970), which is at first surprising but, on closer examination, to be expected, as argued below.
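For a rough sense of where a 10^7-bit genome capacity comes from: each DNA base is one of four nucleotides, i.e. 2 bits, and the E. coli genome is on the order of 4.6 million base pairs. A minimal sketch (the genome length is a commonly cited approximate figure, not taken from the quoted paper):

```python
import math

# Each DNA base is one of 4 nucleotides -> log2(4) = 2 bits per base.
bits_per_base = math.log2(4)

# Approximate E. coli K-12 genome length in base pairs (commonly cited value,
# assumed here for illustration).
genome_length_bp = 4.6e6

info_capacity_bits = bits_per_base * genome_length_bp
print(f"{info_capacity_bits:.2e} bits")  # ~9.2e6, i.e. on the order of 10^7
```

This is consistent with Johnson's 10^7 figure, and several orders of magnitude below Morowitz's structural estimate, which is the gap the quoted passage goes on to discuss.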

Living cells, as all matter, must obey the energy conservation principle, which takes the form of the first law of thermodynamics

dU = δQ + δW.


In the thermodynamic sense, cells can be viewed as machines, similar to a combustion engine engaged in a Carnot cycle, performing work and generating heat, requiring constant supply of energy and matter (i.e. energy-giving molecules like glucose). A more appropriate formulation of the energy balance is through the Gibbs free energy that accounts for a change in the numbers of molecules and the presence of several molecular species:

G = U − TS + PV = μN, or dG = −S dT + V dP + μ dN.


Hence, the entropy differential can be written as

dS = (dU + P dV − μ dN) / T,

which indicates that entropy changes can be achieved through heat production, change of volume or a flux of molecules.
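As a sanity check on that differential, one can verify the algebra symbolically: expanding dG from the definition G = U − TS + PV and equating it to dG = −S dT + V dP + μ dN should yield dS = (dU + P dV − μ dN)/T. A sketch with sympy, treating the differentials as formal symbols:

```python
import sympy as sp

# Thermodynamic variables and their differentials as independent symbols.
T, S, P, V, mu = sp.symbols('T S P V mu', positive=True)
dU, dS, dT, dP, dV, dN = sp.symbols('dU dS dT dP dV dN')

# From G = U - T*S + P*V:  dG = dU - T*dS - S*dT + P*dV + V*dP.
dG_from_definition = dU - T*dS - S*dT + P*dV + V*dP

# Gibbs relation: dG = -S*dT + V*dP + mu*dN.
dG_gibbs = -S*dT + V*dP + mu*dN

# Solve for dS and compare against (dU + P*dV - mu*dN)/T.
solution = sp.solve(sp.Eq(dG_from_definition, dG_gibbs), dS)[0]
print(sp.simplify(solution - (dU + P*dV - mu*dN)/T))  # 0
```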

Since the entropy of an ideal gas of N particles with total energy E, of mass m each, is (Landau and Lifshitz, 1969)

S = k_B N [ ln( (V/N) (4πmE / (3Nh²))^{3/2} ) + 5/2 ],
this means that confining molecules within space, as is the case with building a cellular structure, reduces the exploration volume V, and thus reduces the entropy of the system accordingly. Conversely, mixing two molecular species with numbers N₁ and N₂ in a fixed volume V by opening a partition between their compartments V₁ and V₂ increases the entropy by the amount given below:

ΔS = k_B [ N₁ ln(V/V₁) + N₂ ln(V/V₂) ]
Therefore, keeping various molecular species separated in individual compartments (including the mitochondria, the nucleus, the endoplasmic reticulum, etc.) is another entropy-reducing process. While the above equations strictly speaking apply to equilibrium situations, the assembled structure of the cell by and large stays in its morphological state, except for mitosis and continuous material transport, which can be regarded as second-order corrections.
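A quick numerical illustration of the mixing formula above (the particle numbers and compartment volumes are made-up values chosen purely for illustration):

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K

# Hypothetical example: two equal compartments, each holding a mole-scale
# count of a different molecular species.
N1, N2 = 6.0e23, 6.0e23
V1, V2 = 1.0e-3, 1.0e-3        # m^3 each
V = V1 + V2                    # total volume once the partition is opened

# Entropy of mixing: dS = k_B * (N1*ln(V/V1) + N2*ln(V/V2)), always > 0
# since V > V1 and V > V2.
dS_mix = k_B * (N1 * math.log(V / V1) + N2 * math.log(V / V2))
print(f"dS_mix = {dS_mix:.2f} J/K")  # positive: mixing increases entropy
```

Running it the other way is the point of the quoted passage: keeping the two species in their separate compartments avoids this positive ΔS, which is one of the entropy-lowering contributions of cellular compartmentalization.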

@Art: The above quoted portion might be interesting to you. This is not the only paper that claims cells have lower entropy than their surroundings; I have come across other papers on the origin of life that treat the cell as having lower entropy than its surroundings. It seems a fairly uncontroversial claim.

Can you share a more detailed version of your argument?

Ashwin, I do not believe a more detailed explanation is necessary. Once one takes into account everything (including solvent and solutes), it becomes quite apparent that living cells are massive generators of entropy. The snippet you provide really doesn’t change that.

I am thinking that the engineers and physicists here are enamored with equations and the like, hence the favorable reception for Brian’s essays. But, as is the case with engineering and physics, if one doesn’t account for everything in a project or a model, then all the equations in the world cannot rescue what is a failed endeavor.

All that being said, check out Table 1 in this paper. I am not just making this stuff up. (The rest of the paper reinforces all of this. Also, there is nothing special about the paper - it is just the first one I came across that had some numbers in a fairly accessible form.) That hydrophobic interactions are entropy-driven is (I thought) widely known and really uncontroversial.


It is easy for Christians to accept design. But it exceeds the scope of what science does to think there will ever be science that confirms "something is without question designed (miraculously)."

Even though ID refuses to spell out the identity of the designer… as soon as anyone attempts to take ID science into the realm of the miraculous… the science has to stand at the boundary of the miraculous and wave goodbye to the sojourners within!

@gbrooks9, please don’t shout (huge font), it is often interpreted as rudeness. Thanks.