On the roles of function and selection in evolving systems

I came across this odd, and oddly grandiloquent-sounding, article via slashdot.org:



The universe is replete with complex evolving systems, but the existing macroscopic physical laws do not seem to adequately describe these systems. Recognizing that the identification of conceptual equivalencies among disparate phenomena were foundational to developing previous laws of nature, we approach a potential “missing law” by looking for equivalencies among evolving systems. We suggest that all evolving systems—including but not limited to life—are composed of diverse components that can combine into configurational states that are then selected for or against based on function. We then identify the fundamental sources of selection—static persistence, dynamic persistence, and novelty generation—and propose a time-asymmetric law that states that the functional information of a system will increase over time when subjected to selection for function(s).


Physical laws—such as the laws of motion, gravity, electromagnetism, and thermodynamics—codify the general behavior of varied macroscopic natural systems across space and time. We propose that an additional, hitherto-unarticulated law is required to characterize familiar macroscopic phenomena of our complex, evolving universe. An important feature of the classical laws of physics is the conceptual equivalence of specific characteristics shared by an extensive, seemingly diverse body of natural phenomena. Identifying potential equivalencies among disparate phenomena—for example, falling apples and orbiting moons or hot objects and compressed springs—has been instrumental in advancing the scientific understanding of our world through the articulation of laws of nature. A pervasive wonder of the natural world is the evolution of varied systems, including stars, minerals, atmospheres, and life. These evolving systems appear to be conceptually equivalent in that they display three notable attributes: 1) They form from numerous components that have the potential to adopt combinatorially vast numbers of different configurations; 2) processes exist that generate numerous different configurations; and 3) configurations are preferentially selected based on function. We identify universal concepts of selection—static persistence, dynamic persistence, and novelty generation—that underpin function and drive systems to evolve through the exchange of information between the environment and the system. Accordingly, we propose a “law of increasing functional information”: The functional information of a system will increase (i.e., the system will evolve) if many different configurations of the system undergo selection for one or more functions.

Pop-sci description:

The authors appear to be a mix of space/earth scientists and philosophers. Lead author:


I thought the biologists here might be amused to discover that there’s a new law in town, that their field is apparently now subject to, and would be curious what they make of it. :wink:


They forgot to define “functional information”. Or “evolve”. Or “selection”. Or “function”. Aside from that, the new law is quite clear.

Functional Information and the Evolution of Systems.

All of the natural laws in Table 1 involve a quantitative parameter such as mass, energy, force, or acceleration. Is there an equivalent parameter associated with evolving systems? We suggest that the answer is information (measured in bits), specifically “functional information” as introduced by Szostak and coworkers (28, 87–89).
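For anyone who wants to see the definition in action, here's a toy sketch (my own illustration, not from the paper): Szostak's functional information is I(Ex) = −log2 F(Ex), where F(Ex) is the fraction of all possible configurations whose function value meets or exceeds the threshold Ex. Take configurations to be binary strings of length 10 and the "function" to be the count of 1-bits:

```python
from itertools import product
from math import log2

def functional_information(configs, function, ex):
    """I(Ex) = -log2 of the fraction of configurations whose
    function value meets or exceeds the threshold Ex."""
    achieving = sum(1 for c in configs if function(c) >= ex)
    if achieving == 0:
        return float("inf")  # no configuration achieves the function
    return -log2(achieving / len(configs))

# Toy system: all 2**10 = 1024 binary strings of length 10.
configs = list(product([0, 1], repeat=10))
ones = sum  # "function" = number of 1-bits in a tuple of 0s and 1s

print(functional_information(configs, ones, 0))   # every string qualifies -> 0 bits
print(functional_information(configs, ones, 10))  # only one string qualifies -> 10 bits
```

Note that the hard part in any real system is exactly what the exhaustive `configs` list hides: you need the full distribution of function values over configuration space, which is precisely what the authors later admit is infeasible.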

Hazen is a co-author on the 2007 paper where he, Griffin, Carothers, and Szostak define functional information.

The problem with their paper, as far as I can tell, isn’t that it lacks definitions. It’s that the central conjecture is already known empirically to be false. Selection does not tend to increase complexity; there is no such overall tendency. It can, but often doesn’t. It’s context-specific.

1 Like

On a related note, I’m glad they are making what we’ve all been trying to explain to creationists abusing FI, explicit:

A significant limitation of the functional information formalism is the difficulty in calculating I(Ex) for most systems of interest. Functional information is a context-dependent statistical property of a system of many different agent configurations: I(Ex) only has meaning with respect to each specific function. To quantify the functional information of any given configuration with respect to the function of interest, we need to know the distribution of Ex for all possible system configurations relevant to the domain of interest. Determination of functional information, therefore, requires a comprehensive understanding of the system’s agents, their interactions, the diversity of configurations, and the resulting functions. Functional information analysis is thus not currently feasible for most complex evolving systems because of the combinatorial richness of configuration space. Even if we could analyze a specific instance where one configuration enables a function, we cannot generally know whether other solutions of equal or greater function might exist in configuration space (13).

In other words: Gpuccio’s method doesn’t work the way people like @Giltil mistakenly believe it does, and a limited sequence alignment can’t, even remotely, be used to estimate functional information.
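To make that last point concrete, here's a toy illustration (mine, not from the paper, and the numbers are arbitrary): if the function is rare in configuration space, a small sample of configurations will almost never contain a functional one, so the sampled estimate of F(Ex) is zero and −log2(0) is undefined. The sample tells you essentially nothing about the true functional information:

```python
import random
from math import log2

random.seed(0)
n = 20  # 2**20 ~ 1e6 configurations; real sequence spaces are vastly larger

# "Function": 1 if a string matches one specific target, 0 otherwise.
target = tuple(random.randint(0, 1) for _ in range(n))
exact_fi = n  # true I(Ex) = -log2(1 / 2**n) = n bits, known here by construction

# Try to estimate F(Ex) from 1,000 random configurations -- the analogue
# of inferring functional information from a limited alignment.
sample = [tuple(random.randint(0, 1) for _ in range(n)) for _ in range(1000)]
hits = sum(s == target for s in sample)
print(hits)  # almost certainly 0, so the estimated F(Ex) is 0
             # and -log2(0) blows up: no estimate at all
```

And that's the easy direction: in real proteins we don't even know the true F(Ex) to compare against, which is the paper's point about combinatorial richness.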

1 Like

Selection, in the paper, is supposed to increase functional information. Where did complexity enter into that? All they seem to say is that selection increases fitness. Is that worth publication in PNAS?

1 Like

This is certainly a much better paper than the Assembly Theory mess. Functional information may be hard to calculate, but it has a very specific interpretation, which is a positive.