So, I’ve reviewed some of this book, and here is the issue I’m seeing. Perakh’s discussion in Why Intelligent Design Fails (Young & Edis 2006, ch. 6) treats the openness of living systems (their continuous exchange of energy and matter with the environment) as sufficient to permit “local decreases in entropy” and therefore the spontaneous rise of “information.” This reasoning conflates thermodynamic entropy, a measure of energy dispersion (S = k ln Ω), with Shannon informational entropy, a measure of signal uncertainty (H = –Σ p log p). While an open thermodynamic system can maintain or even increase physical order through energy throughput, there is no empirical or theoretical basis for the claim that energy flow alone can create or preserve functional information. In fact, the evidence points the other way: unless an informational system is specifically designed to withstand noise (through coding, redundancy, and error-correction mechanisms), energy fluctuations invariably accelerate the Shannon-entropic degradation of stored information.
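The distinction can be made concrete with a toy simulation (my own illustration, not Perakh’s; the binary message and flip probabilities are arbitrary assumptions). It computes H = –Σ p log p for a stored bit string and shows that repeated unstructured noise drives the string toward maximum Shannon entropy, i.e., toward informational uselessness, with no error correction in play:

```python
import math
import random

def shannon_entropy(probs):
    """H = -sum(p * log2(p)) in bits, skipping zero-probability symbols."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def bit_distribution(bits):
    """Empirical symbol probabilities [P(0), P(1)] for a bit string."""
    ones = sum(bits) / len(bits)
    return [1 - ones, ones]

def flip_bits(bits, p, rng):
    """Unstructured noise: each bit flips independently with probability p."""
    return [b ^ (rng.random() < p) for b in bits]

rng = random.Random(0)
message = [0] * 900 + [1] * 100        # highly ordered: 90% zeros, low H

noisy = message
for _ in range(20):                    # repeated exposure to random perturbation
    noisy = flip_bits(noisy, 0.1, rng)

h_clean = shannon_entropy(bit_distribution(message))
h_noisy = shannon_entropy(bit_distribution(noisy))
print(h_clean, h_noisy)                # H rises toward the 1-bit maximum
```

Nothing in the noise process “creates” information; it only erases the statistical structure that made the message distinguishable from static, which is the asymmetry the conflation of the two entropies obscures.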
What is remarkable in biology is that genetic information endures massive environmental flux: genomes survive asteroid impacts, volcanic winters, extremes of heat and cold, ionizing radiation, and chemical perturbation. Such resilience strongly implies a pre-engineered robustness within the informational architecture itself—error-correction, repair, and redundancy mechanisms tuned to withstand vast energetic disturbance. Far from suggesting spontaneous origin, the capacity of the genetic system to remain coherent through planetary-scale catastrophes underscores the sophistication of its design.
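The protective effect of redundancy plus error correction can be sketched with a toy 5× repetition code (again my own illustration under stated assumptions: a 5% bit-flip channel and majority-vote decoding, not a model of actual DNA-repair machinery):

```python
import random

def encode(bits, n=5):
    """Redundancy: store n copies of each bit."""
    return [b for b in bits for _ in range(n)]

def decode(coded, n=5):
    """Error correction: majority vote over each group of n copies."""
    return [int(sum(coded[i:i + n]) > n // 2)
            for i in range(0, len(coded), n)]

def corrupt(bits, p, rng):
    """Noise channel: each bit flips independently with probability p."""
    return [b ^ (rng.random() < p) for b in bits]

def errors(a, b):
    return sum(x != y for x, y in zip(a, b))

rng = random.Random(42)
genome = [rng.randint(0, 1) for _ in range(1000)]

bare = corrupt(genome, 0.05, rng)                       # unprotected copy
protected = decode(corrupt(encode(genome), 0.05, rng))  # coded copy

print(errors(genome, bare))       # roughly 5% of bits corrupted
print(errors(genome, protected))  # far fewer survive corruption uncorrected
```

The point of the sketch is that the noise channel is identical in both runs; the difference in outcome comes entirely from the coding scheme, which must exist before the noise arrives in order to do any good.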
We have no evidence whatsoever that a data-storage and replication system of comparable durability, reliability, and self-repair could ever arise by undirected physical processes.
As Brillouin cautioned, the analogy between entropy and information “must not be taken for identity” (Science and Information Theory, 1956, p. 12). Yockey demonstrated that “the genetic code is not an analogy but an exact communication system” and cannot arise by thermodynamic ordering alone (Information Theory and Molecular Biology, 1992, p. 313). More recently, Walker and Davies (2013, J. R. Soc. Interface 10:20120869) emphasized that the origin of life is “algorithmic, not thermodynamic”: living systems persist by maintaining and transmitting semantic information against noise, not by merely exporting heat. Thus, while Perakh correctly observes that open systems can exhibit local negentropy, we possess no evidence that such systems can originate or sustain informational order without prior intelligent or algorithmic design; empirically, unstructured energy tends to destroy informational coherence rather than generate it.
Any theory that relies on the unguided origin of such a code, a code that still exists and still functions after 400+ million years, after trillions upon trillions of copies of copies, and after massive cataclysmic events, seems absurd on its face.