Is There Evidence for a Universal Common Designer?


In light of conflicting new data currently being reported in multiple fields, a different perception of evolution is beginning to manifest, wherein the processes underlying organismal development are recognized as causes of evolution. New questions are being raised in light of this novel evidence. Should evolution be considered a truly random process or a directed one? Do all living organisms share a common ancestor or a common evolutionary mechanism?

Based on previous studies and available literature, herein, I demonstrate how the quantum mind theory and universal common design/archetype are well-supported theories. I combine the two common explanations for life into one theory called the “universal common designer theory.” Moreover, I provide a model describing the nature of this designer in more detail, which can potentially predict biological phenomena. I then provide a method to test for the existence of a universal common designer in nature by showing that a universal common design defines evolution. Finally, I address several scientific objections revolving around the dysteleological argument that have been leveled against the theory, providing a sound methodology to further test this idea.

Keywords: common design, objective reduction


As Ayala [1] clarified, evolution by natural selection explains natural designs without a designer. For instance, when scientists use the term “random” to describe mutations, they refer to the unintentional nature of the process; mutations do not “attempt” to supply what the organism “needs” in a given moment or place. Instead, environmental factors influence only the rate, not the course, of mutations. For example, contact with harmful chemicals may increase mutation rates but will not increase beneficial mutations that render an organism resistant to those chemicals. In this sense, mutations are considered random because no “conscious” intent is involved, suggesting that there is no personal agent selecting adaptive combinations in evolution. Ayala further explains how this description of mutations essentially forms the basis of the Modern Synthesis theory, which was proposed between 1936 and 1947, reflecting the consensus on how evolution proceeds. Nineteenth-century evolutionary ideas of Charles Darwin, Gregor Mendel, and others were expanded upon by researchers studying population genetics between 1918 and 1932, who laid the groundwork for the Modern Synthesis by showing that Mendelian genetics is consistent with natural selection and gradualism. [2]
Nevertheless, conflicting new data are currently being reported in multiple fields, and thus, a different perception of evolution is beginning to manifest, wherein the processes by which organisms develop are recognized as causes of evolution. [2] For instance, analysis of the genomes of 46 sequenced isolates of Escherichia coli provided a statistically supported comparison of the phylogenetic tree topologies for regulatory regions, their regulated genes, and vertical inheritance. The results of this comparison indicate that the evolution of the regulatory regions of over half of the core genes (i.e., genes shared by all isolates) was incongruent with vertical inheritance. [3]
Moreover, Martincorena et al. [4] proposed, based on their observations, that the mutation rate has been evolutionarily optimized to reduce the risk of deleterious mutations. However, current knowledge of factors influencing the mutation rate does not explain these observations, suggesting that other mechanisms are likely involved.
New questions are being raised in light of this novel evidence. Should evolution be considered a truly random process or a directed one? Do all living organisms share a common ancestor or a common evolutionary mechanism?
Based on previous studies and available literature, herein, I aim to demonstrate how the quantum mind theory is well-supported and can potentially predict biological phenomena. Moreover, I aim to substantiate the existence of a universal common designer in nature by showing that a universal common design underlies evolution. Previous attempts to reintroduce an intelligent designer into science by intelligent design (ID) theorists have not gained traction for several reasons.
ID theorists argue that the very presence of complex specified information (CSI) found in DNA automatically provides empirical support for the claim that an intelligent designer created and designed life because only human designers can produce CSI based on uniform experience. ID theorists attempt to provide examples of irreducible complexity or specified information in nature to establish that an intelligent designer devised all living organisms. This involves demonstrating how removing one part of a complex design, such as an eye, would cause the entire system to cease functioning. However, human designers are finite and fallible beings that design things based on limited prior knowledge using and modifying preexisting material, which would merely mimic Neo-Darwinian mechanisms. More importantly, ID theorists have not yet attempted to prove or explore the nature of this designer.
Unlike most ID theorists, Roger Penrose and Stuart Hameroff have explored the nature of this intelligent designer through a gravity-induced self-collapse, which they refer to as an “objective reduction.” They provided a comprehensive model of the nature and mechanism of this conscious agent that can explain the origin and evolution of life, species, and consciousness. [5] However, their theory does not differentiate this conscious agent from mindless forces. For instance, even if the proposed experiments confirmed Penrose’s prediction, they would prove only that non-biological settings display elements that mirror conscious behavior, which could be dismissed as anthropomorphism.
The Extended Modern Synthesis theory, however, holds two key assumptions [6]: (1) mutations are a random process, and (2) all living organisms share a common ancestor. This article considers both assumptions while primarily focusing on the latter. Here, I show how the universal common archetype/design theory first proposed by Richard Owen, a predecessor of Darwin, can explain and predict how biological processes on Earth developed over time. I then propose an updated version of the Modern Synthesis theory. [7] Finally, I provide a model describing the nature of this designer in more detail, creating a synthesis between the quantum mind theory and the universal common design/archetype theory.

Quantum mind theory

According to Roger Penrose, the action of consciousness proceeds in a way that cannot be described by algorithmic processes. [8] For instance, conscious contemplation can ascertain the truth of a statement and freely make intellectual and moral judgments. This involves distinguishing between true and false statements or what is morally “right” versus “wrong.”

The only thing in nature that does this is a wave-function collapse. For instance, at small scales, quantum particles simultaneously exist in the superposition of multiple states or locations, described by a quantum wave function. However, these superpositions are not seen in our everyday world because efforts to measure or observe them seemingly result in their collapse to definite states. [5] Why quantum superpositions are not seen is a mystery known as the measurement problem, which seems somewhat related to consciousness. Experiments from the early 20th century indicated that conscious observation caused superposition wave functions to collapse to definite states, choosing a particular reality. Consciousness was said to collapse the wave function under this view. [5]

Moreover, Diederik Aerts [9] demonstrated how these two phenomena are identical by applying quantum theory to model cognitive processes, such as information processing by the human brain, language, decision-making, human memory, concepts and conceptual reasoning, human judgment, and perception. The increasing empirical success of quantum cognition theory suggests that we have quantum minds.

Other empirical data have shown that the brain is a quantum computer that uses quantum mechanical processes, such as quantum tunneling and superposition, [10, 11] explicitly suggesting that we have quantum minds, as the Orch-OR theory predicted (see section 4.5, OR and Orch-OR, of “Consciousness in the universe” by Hameroff and Penrose for more details). [12]

Lastly, observations and experiments on the fine-tuning constants seem to support an aspect of quantum mind theory called the universal proto-consciousness field theory. This field theory has also been referred to by Penrose as objective reduction (OR) and incorporated into his Orch-OR model to explain why humans have consciousness and why these fine-tuning constants exist.

For the sake of clarity, quantum mind theory does not advocate for dualism or an additional supernatural force/substance that would operate outside the rules of science. Instead, it advocates for consciousness as an essential ingredient of physical laws that science has not yet fully understood. For more details, please refer to the introduction of “Consciousness in the universe” by Hameroff and Penrose. [12]

In the next section, I provide a model describing the nature of this universal proto-consciousness in more detail using fine-tuning constants related to life and advanced life. However, Penrose’s highly speculative gravity-induced collapse model will not be discussed or further developed because it is beyond the scope of this article.

Definition: universal proto-consciousness, i.e., the universal self-collapsing wave function.

Universal proto-consciousness field

According to the eternal inflationary theory, the acceleration of the expanding universe is thought to result from an explosion or collision of quantum fluctuations of particles, called the “cosmological constant,” that permeate the entire multiverse, wherein a billion-and-one positive particles and a billion negative particles come into existence at once.

The cosmological constant is fine-tuned to roughly one part in 10^120. [13] When scientists calculate the rate of expansion back to one second after the Planck era of the early universe, the required precision is revealed to be an astounding one part in 10^(10^123).

Furthermore, using data from the Planck satellite and the Wilkinson Microwave Anisotropy Probe (WMAP), researchers have demonstrated that the fine-structure constant of physics has remained fixed over the universe’s history. [14] For the first time, a team of five physicists led by Nathan Leefer has confirmed the constancy of the fine-structure constant across the entire observable extent of the universe.
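For reference, the fine-structure constant being tested here is the dimensionless combination of fundamental constants

```latex
\alpha = \frac{e^{2}}{4\pi\varepsilon_{0}\hbar c} \approx \frac{1}{137.036}
```

where e is the elementary charge, ε0 the vacuum permittivity, ħ the reduced Planck constant, and c the speed of light. Because α is dimensionless, any variation over cosmic history would show up directly in atomic spectra, which is what the cited measurements constrain.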

This places constraints on models of dark energy that invoke rolling scalar fields and curtails the parameter space of supersymmetric and string-theory models. [15]
Although the Planck satellite results do not yet rule out all models of dark energy that change over time, there is no evidence that the cosmological constant has varied, and we now have evidence suggesting that it was probably constant throughout time and space.

Hypothetically, this indicates that if our universe’s expansion rate had taken different values with larger amounts of dark energy, the expansion would most likely have blown the cosmic material apart before the planets and stars, where life of any kind might evolve, could form. If the expansion rate had taken different values with smaller amounts of dark energy, the universe would most likely have collapsed back into a singularity before it ever reached its present size.

This sort of fine-tuning plays a role in biology as well. For instance, quantum tunneling needs to be extremely precise for hemoglobin to transport the right amount of oxygen to the cells of all vertebrate and most invertebrate species. [16]

For instance, if the conscious observer chooses to measure the momentum of a particle with precision, the observer discovers that the position of the particle is now known only approximately, to within ± half a mile (for more details, please refer to inference 5 below). However, according to the Heisenberg principle, the more precisely the position of a particle is determined, the less precisely its momentum can be predicted from initial conditions, and vice versa. If the uncertainty in the position becomes much greater or smaller than half a mile, hemoglobin will not function as it does, rendering advanced life impossible. [17]
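For reference, the Heisenberg bound invoked here relates the position uncertainty Δx and the momentum uncertainty Δp as

```latex
\Delta x \, \Delta p \;\geq\; \frac{\hbar}{2}
```

so fixing the precision of a momentum measurement sets a floor on the position uncertainty, which is the quantity the hemoglobin argument above claims must fall near half a mile.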

This means that, although the Heisenberg uncertainty principle describes a random process, the magnitude of its uncertainty must be fine-tuned. Overall, combining all the evidence reveals that the universal proto-consciousness has the following attributes:

  1. Is causally active everywhere (i.e., omnipresent).

There are two types of causal interactions in nature: antecedent causation and simultaneous causation. In antecedent causation, every cause precedes its effect in time; such causes are material causes that operate under the principles of classical physics.

In contrast, in simultaneous causation, the cause exists together with its effect within the same event; such causes are non-local and operate under the principles of quantum physics. We have experimental proof that the “choice of measurement in one lab really causes a change in the local quantum state in the other lab.” [18]

  2. Knows the position and momentum of every particle in the universe at a particular moment in time according to the universal wave function (i.e., omniscient).

The universal wave function is the “basic physical entity” representing the “totality of existence” or “the fundamental entity, obeying at all times a deterministic wave equation.” Each wave function is a mathematical configuration of matter or the universe and has been confirmed to exist as an objective part of reality. [19]

Furthermore, according to the superposition principle in quantum mechanics, wave functions can be added together and multiplied by complex numbers to form new wave functions. The Schrödinger equation determines how wave functions evolve over time, and a wave function behaves qualitatively like other waves, such as water waves or waves on a string, because the Schrödinger equation is mathematically a type of wave equation, explaining the name “wave function.”
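For reference, the time-dependent Schrödinger equation described in this paragraph is

```latex
i\hbar \, \frac{\partial}{\partial t} \Psi(\mathbf{r}, t) = \hat{H} \, \Psi(\mathbf{r}, t)
```

and because the equation is linear, any combination aΨ1 + bΨ2 of solutions (with complex coefficients a and b) is itself a solution, which is precisely the superposition principle described above.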

  3. Has the power to accelerate our universe forever (i.e., omnipotent).

If the topology of the universe is open or flat, or if dark energy is a positive cosmological constant (both consistent with current data), the universe will continue expanding forever.

Moreover, the scientific definition of “power” is the rate at which energy is transferred, used, or transformed; it is not synonymous with energy. Therefore, the cosmological constant or vacuum energy is not an infinite amount of energy; instead, it is a very precise cancellation effect between negative and positive energy particles or fluctuations.
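In symbols, the distinction drawn here is

```latex
P = \frac{dE}{dt}
```

power P is the time rate at which energy E is transferred (watts, i.e., joules per second), not a quantity of energy itself.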

  4. Does not have defined locations in space and time (i.e., eternal).

According to the classical definition of space-time, things that do not have defined locations in space and time do not exist in classical space and time. Because classical space and time have been shown to be interwoven concepts and matter/energy cannot exist without space-time, something considered tenseless would have to be understood as “non-local,” that is, outside the classical space-time of general relativity.

This is referred to as quantum non-locality, where each particle moves in a so-called superposition of different velocities simultaneously. This has been confirmed to be a universal property of the world, regardless of how and at what speed quantum particles move. [20]

  5. Intended for advanced life to survive, reproduce, and pioneer different environments (i.e., personal).

Quantum physics experiments have revealed no concrete physical reality made of classical space-time constituents. Instead, the so-called material realm actually exists in a super-positional state of all quantum possibilities (i.e., wave functions) that are mathematical in nature.

The intangible phenomenon of conscious observership is the only means capable of “collapsing” any given combination of quantum wave functions, which imparts a concrete and physical reality to them: “No naive realistic picture is compatible with our results because whether a quantum could be seen as showing particle-like or wave-like behavior would depend on a causally disconnected choice.” [21]

This shows how the right fine-tuning values were carefully chosen to allow advanced life to exist from the beginning leading up to the present.

  6. Intends all possible kinds of life to potentially survive, reproduce, and pioneer different universes (i.e., omnibenevolent).

Every living creature on Earth uses the same code, in which DNA stores information using four nucleotide bases. The sequences of nucleotides encode information for constructing proteins from an alphabet of 20 amino acids. Why were these numbers chosen rather than some other numbers?

Patel showed how quantum search algorithms explain why these numbers were chosen. [22] To summarize, if the search processes involved in assembling DNA and proteins are to be as efficient as possible, the number of bases should be four, and the number of amino acids should be 20.
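Patel's argument rests on Grover's quantum search algorithm, under which the number of items N that can be distinguished with Q quantum queries satisfies N(Q) = 1/sin²(π/(4Q + 2)). A minimal sketch of this arithmetic (the function name is my own illustration, not from the cited paper):

```python
import math

# Grover-search relation underlying Patel's argument: Q quantum queries
# can discriminate N(Q) = 1 / sin^2(pi / (4Q + 2)) items.
def distinguishable_items(q: int) -> float:
    return 1.0 / math.sin(math.pi / (4 * q + 2)) ** 2

# One query discriminates 4 items (the 4 DNA bases); three queries
# discriminate ~20.2 items (the 20 amino acids).
```

Evaluating the function at Q = 1 and Q = 3 reproduces the 4-base and 20-amino-acid optima the text describes.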

In other words, to address a common set of problems faced by organisms possessing different characteristics and living in different habitats, a single optimal solution must be employed. This means that if we replay the evolutionary history of life, it would lead to identical or nearly identical outcomes. An experiment has revealed that this quantum search algorithm is itself a fundamental property of nature. [23]

  7. Must exist in all possible worlds or universes to create and sustain life (i.e., necessary).

Observations show that dark energy is consistent with general relativity and affects the expansion of the universe. [24] Because general relativity predicted the existence of gravitational waves, which has since been confirmed and supports the eternal inflationary theory, the cosmological constant presumably applies to the smaller universes created within regions of our universe as well.

Empirical support for the model

Support for this model has come from confirmed predictions of non-random mutations. For instance, as Mattick and Dinger indicated, the presence of non-protein-coding or so-called ‘junk DNA,’ which comprises more than 90% of the human genome, has long been argued to be evidence of the accumulation of evolutionary debris by blind Darwinian evolution. This would argue against intelligent design, as an intelligent designer would presumably not fill the human genetic instruction set with meaningless information.

This objection has been essentially refuted by growing evidence of function in noncoding regions of the genome, which has reciprocally been used to support the notion of ID and to challenge the conception that natural selection accounts for the existence of complex organisms. In fact, it has been shown that well over 80% of junk DNA is functional. [25] However, there is controversy surrounding these results. I direct readers to an article that addresses and responds to the objections and controversy. [26]
The vast majority of mutations in regions that do encode proteins are deleterious and prevent beneficial mutations from being fixed within the population. Nonetheless, in a study on 34 E. coli strains, Martincorena, Seshasayee, and Luscombe [4] discovered that the mutation frequency varies across bacterial genomes. Some regional “hot spots” have a reasonably high mutation rate, whereas “cold spots” display a reasonably low rate of genetic change. The researchers discovered that the hot- and cold-spot locations are not random. [4] Thus, it appears that the mutation rates have been fine-tuned to lower the risk of harmful genetic changes. Recent studies have drawn the same conclusion. [27, 28]
A similar study suggested that mutations are guided by both the physical properties of the genetic code and the need to preserve critical protein function. [29] The authors analyzed mitochondrial genomes in several species and found numerous positively selected sites where DNA changes allowed the species to adapt to their environments. They hypothesized that the cell might make mistakes copying repeated sequences during DNA replication. While the cell repairs these copying errors, the DNA has more time to mutate.
These repeats influence the mutation rate, as mutations in the repeats on either side of the mutated DNA would abrogate protein function, preventing them from being eliminated and resulting in a mutational ‘hot spot’ between stable DNA sequences. Furthermore, the authors found that 97% of the sites were under positive selection, and 60% of all mutated sites were near repetitive sequences. [29]
Thus, all this evidence suggests that the mutation rates have been fine-tuned to lower the risk of harmful genetic changes. This corresponds directly with the model’s claim that the creator used fine-tuned electron tunneling in the construction of advanced life. [16] In the next section, I highlight more observations that further reveal the nature of this being.

Universal common designer theory

DNA contains two types of information: digital, represented by genetic information, and analog, represented by the genome; both are present in DNA and have many properties identical to those in man-made computers and linguistic texts.

For example, in modern computers, logic gates convert input signals into an output. Typical gates include AND, OR, and YES. By using the output of one logic gate as the input to another, computer experts can develop sophisticated networks of gates to compute answers to complex questions.

Researchers noticed that the initiation of DNA replication also operates as a logic gate at the molecular level. They were able to modify the biochemical systems involved in DNA replication to implement AND, OR, and YES logic gates that could respond to chemical inputs provided by scientists in a laboratory setting. [30] Another example is the genome of the bacterium E. coli, which functions like the Linux computer operating system (OS), as revealed by other researchers. [31]
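As an illustrative sketch of the logic-gate abstraction described above (the function names and the example network are my own, not the biochemistry of the cited experiments), with True/False standing in for the presence or absence of a chemical input:

```python
# AND, OR, and YES gates as boolean functions; YES is a buffer whose
# output simply mirrors its input.
def gate_yes(a: bool) -> bool:
    return a

def gate_and(a: bool, b: bool) -> bool:
    return a and b

def gate_or(a: bool, b: bool) -> bool:
    return a or b

# Gates compose into networks by feeding one gate's output into the
# next, e.g. OR(AND(x, y), YES(z)).
def network(x: bool, y: bool, z: bool) -> bool:
    return gate_or(gate_and(x, y), gate_yes(z))
```

Composing gate outputs into further gate inputs is the same design move the cited DNA-replication experiments exploit at the molecular level.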

Moreover, the information content of proteins bears almost the same mathematical structure as human language. This involves discrete mathematics or statements of logic and language that humans use to communicate with each other every day, such as phrases, signs, and symbols that are meaningful and personal. [32, 33]

Of course, some scientists insist that the comparisons between DNA information and human information are merely used in a metaphorical sense. [34] This is contradicted by biotechnologists, who have shown that the protein machines that operate on DNA during processes, such as transcription, replication, and repair, operate very similarly to a computer system. In fact, the similarity is so remarkable that this insight has spawned a new area of nanotechnology called DNA computing. [35]

Considering these additional data, this universal proto-consciousness probably operated in a manner similar to humans when designing life on Earth. This means that we would not have to worry about using an unfalsifiable theory involving an omnipotent agent, because a non-computable being cannot violate his own nature (see Appendix on why God cannot violate his own nature).

In other words, the non-computable trait that this designer possesses offsets the omnipotent trait that this designer would also have to possess if this is true. For this reason, we would expect God’s human nature to be consistent without the flaws that humans naturally have because of their inherent physical limitations.

This allows us to treat an omnipotent God in the same way we would treat other intelligent agents (Neanderthals, modern humans, extraterrestrial intelligence, etc.) when we want to use a valid cause to explain a biological phenomenon over a mindless force.

Thus, all candidates are considered natural but immaterial causes that we can test because consciousness is supposed to be fundamental physics, not classical physics.

If this theory is true, we would expect all extant organisms to have a common design that can be traced back to this universal common designer.



Universal common design theory

This common designer implies a common design rather than common descent because only humans produce top-down causation in the form of algorithmic information, such as RNA viruses. More importantly, observations have suggested that viruses were not only the probable precursors of the first cells but also helped shape and build the genomes of all species, including humans. [36]

For instance, scientists synthesized the RNA molecules of a virus and reconstructed a virus particle from scratch. [37] They accomplished this by creating another virus and using its parts, such as specialized proteins (enzymes), to construct an RNA virus, thereby solving the problem of RNA instability. Other experiments have shown that RNA viruses can be engineered to interact with the host miRNA pathways, and miRNAs can be used to control viral tropism. [38]

For example, endogenous retroviruses (ERVs) protect the host cell’s genome from retroviral infections by disrupting the endogenization process of invading retroviruses. ERVs must resemble retroviruses to act as a defense mechanism against incoming harmful viruses. [39]

A different study, by Kazuaki Monde et al., also revealed that “the strict dependence of HERV-K on SOX-2 has allowed HERV-K to protect early embryos during evolution while limiting the potentially harmful effects of HERV-K retrotransposition on host genome integrity in these early embryos.” [40]

This is how human designers operate. They use preexisting mechanisms, material parts, and digital information to assemble designs to achieve a purpose.

The other reason a common designer implies having a common design rather than a common descent is because natural selection lacks the capacity to elucidate the physical mechanisms underlying the transition from non-life to life or to distinguish non-living from living. [41]

Furthermore, RNA viruses cannot be included in the tree of life because they do not share characteristics with cells, and no single gene is shared by all viruses or viral lineages. While cellular life has a single, common origin, viruses are polyphyletic—they have many evolutionary origins. [42]

Overall, this is why we can infer that all living animals share a common design that can be traced back to a universal common designer. If this theory is true, then we can expect humans to possess cognitive qualities that are exceptions to what would otherwise be predicted if we were evolutionary descendants, such as

KIF18a [also known as (a.k.a.) Kinesin 8]

KNL1 (a.k.a. CASC5)

SPAG5 (a.k.a. astrin)

Endorestiform Nucleus

These examples involve the cognitive abilities of the human brain that either have not been observed to be present in animal brains or do not work properly in animal brains through experimentation.


Common design: To create and develop animals through the common process of HRT for the common purpose of surviving, reproducing, and pioneering different environments.

Universal common designer: universal self-collapsing genetic code shown by the shared DNA among all living organisms (i.e., objective reduction).

Created kinds: A recognizable base form and structure that does not change over time (breeds within a kind). They are separate, unique (no common ancestors), and fully functional (no trial and error); similar in form/design/variable surface features owing to the similarity in function and common designer. A kind can produce many species, has wide genetic diversity, and can acclimate to the environment. [43] Example: 196 bird kinds, the wolf kind, the elephant kind.

Species: A similar base form and similar surface features that do change over time, such as size and color (breeds within a species). Represents the entire group related by common ancestry from present generations. A species can produce many varieties, has narrow genetic diversity, and is acclimated to the environment. [43] Example: over 10,000 bird species today; wolf/dog/fox; Asian and African elephants/mastodon/mammoth.

Basic types: Represents the entire group related by common ancestry, including both past (created kinds) and present generations (species). There is reproductive discontinuity between basic types and reproductive continuity within a basic type.

Wood [44] provides a list of taxa from the common descent model that is considered “basic types” or “species” according to the common design model.

Origin of life and species model

Approximately 3.8 billion years ago, pi electron resonance clouds in single-chain amphiphile molecules coalesced into geometric pi-stacks, forming viroids with quantum-friendly regions for OR events within Earth’s deep-sea hydrothermal vents. [45]

Subsequently, through natural selection and OR events, groups of viroids formed into highly ordered local domains of key biomolecules of a DNA/RNA virus or molecule, which later evolved into different species of unicellular organisms. [46]

Through HRT, these unicellular organisms underwent extensive regulatory switching and rewiring in their noncoding regulatory regions, which led to the divergence of transcription start sites and gene expression levels in the formation of primitive multicellular organisms. This same multicellular toolkit and modules of slime molds then developed into created kinds at different times and global locations (for more details, read Stuart Hameroff’s description of how microtubules played a part in the origin of species). [5]

This model already has a history of accurate predictions regarding gaps in the fossil record. For instance, the fossil record has revealed that the observed pattern of no evolutionary change punctuated by rapid biological innovations matches the patterns predicted [47] if a common design/archetype accounts for life’s history and diversity.

For instance, God fine-tuned life in just-right forms, at just-right times, and at just-right abundance and diversity levels and replaced those life forms via mass extinction events followed quickly by mass speciation events. The new life forms removed the just-right amounts of carbon dioxide and methane from the atmosphere to perfectly compensate for the brightening Sun. [48]

Without a conscious mind, there would be no possibility of perfectly compensating for the increasing brightness of the Sun so that life can be continuously sustained on Earth throughout the past 3.8 billion years.

This activity would explain how instances of high gene-tree conflict (discordance in phylogenetic signals across genes) in mammals, birds, and several major plant clades correspond to increased rates of morphological innovation. [49]

Origin of species predictions

Based on this theory, we would expect to find:

(A) Over 80% of families and orders evolved separately.

(B) Over 80% of ERVs and pseudogenes are functional.

(C) The regulatory regions of core gene promoters between families and orders are over 80% incongruent with species phylogenies (i.e., vertical inheritance).

See the list of suspected basic types based on preliminary analysis. [44] In addition, read Appendix 2 for an explanation of why these predictions are separate from the common descent model.


We can assess prediction (A) by applying analogous phenotypic traits between families and orders to different environmental niches based on similar needs.

The assessment uses a four-question survey in which each practical criterion is designated by a letter (A–D) and titled as a question relating to food, predators, reproduction, or habitat.

For example, if the answer to the question “Is the common feature of this group being used differently in their habitats?” is “No,” “To be determined (TBD),” or “Not applicable (N/A),” a follow-up question is asked: “Do they respond differently in different habitats?” Answering this may require artificially relocating the organisms to different habitats. If the answer to either question is “Yes,” we can begin testing whether there are adaptive and structural convergent genes pertaining to the application of this analogous trait. If the test reveals at least one adaptive and/or structural gene, we can confidently conclude a common design.

However, if the answer to both questions is “No” or “TBD,” we must apply the same question formula to prey and/or predator measures to potentially draw a definite conclusion. If the answer is still “No” or “TBD,” then we ask, “Is the common feature of this group being used differently in sexual reproduction?”

The results are inconclusive if every question yields a “No” or “TBD” answer. This method was inspired by a study on red and giant pandas, which concluded that their false thumbs evolved separately in response to similar needs, [50] and a study that showed why and how they evolved separately. [51]
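The survey’s branching logic amounts to a small decision procedure. The sketch below is one possible reading of the flow described above; the criterion order, return labels, and `answers` encoding are my assumptions, not a published specification:

```python
# Vocabulary of survey answers as described in the text.
YES, NO, TBD, NA = "Yes", "No", "TBD", "N/A"

# Criteria are tried in the order the text describes:
# habitat first, then predators/prey, then reproduction.
CRITERIA = ("habitat", "predators", "reproduction")

def survey_verdict(answers: dict, has_convergent_gene: bool) -> str:
    """answers maps each criterion to a pair:
    ("used differently?", "responds differently?" follow-up).

    Returns "common design", "test genes further", or "inconclusive".
    """
    for criterion in CRITERIA:
        used_differently, responds_differently = answers.get(criterion, (TBD, TBD))
        if YES in (used_differently, responds_differently):
            # A "Yes" licenses the genetic test for adaptive/structural
            # convergent genes underlying the analogous trait.
            return "common design" if has_convergent_gene else "test genes further"
    # Every question yielded "No", "TBD", or "N/A".
    return "inconclusive"
```

For instance, answers resembling the Equidae case below (habitat: Yes, but no convergent gene identified yet) would route the analysis to the genetic test rather than to an immediate verdict.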

As a test run, we will evaluate the family Equidae to determine whether it is a “basic type,” drawing on the extensive work that has already been done. For instance, a recent study [52] confirmed the results of earlier studies, showing that all horses belong to a single basic type. Most importantly, preliminary results showed evidence that horses are, for the most part, sufficiently different from tapirs and rhinos, which share the trait of an odd number of toes. [53] However, these two studies mainly give us good reason to “suspect” that all three groups are basic types, rather than establishing this for only one group.


Is the common feature of this group being used differently?
(A) Habitat? Yes
(B) Food? TBD
(C) Reproduction? N/A
(D) Predators? Yes

Horse behavior is best understood from the view that horses are prey animals with a well-developed fight-or-flight response. Their first reaction to a threat is often to flee, although sometimes they stand their ground and defend themselves or their offspring in cases where flight is untenable, such as when a foal is threatened. [54]

Tapirs are strong swimmers that may walk along the bottom of riverbeds to find food. They instinctively escape predation by moving into the water, and they can stay submerged in deep water long enough to make any predators clinging to their backs let go. [55]


According to this ecological assessment, horses, tapirs, and rhinos use their odd toes differently with respect to their habitats and predators. Horses use the trait to flee predators in open terrain, tapirs use it for swimming and evading predators in the water, and rhinos use it to charge predators.

Future Research

Preliminary results suggest that Equidae is a legitimate basic type that shares a common design with tapirs and rhinos, based on the odd-toed trait fitting different applications in response to similar needs.
However, this conclusion is tentative because we still need to find evidence for adaptive and structural convergent genes underlying the odd-toed trait. Future experiments should elucidate these underlying mechanisms and genes. The questions that remain are:

“Should evolution be considered a truly random process or a directed one?” and “Do all living organisms share a common ancestor or a common mechanism?”

The common design model views convergence as resulting from a universal common designer who employs a single, optimal solution to address a common set of problems faced by organisms possessing different characteristics and living in different habitats.

Common descent views convergence as occurring when unrelated species encounter identical, or nearly identical, environmental, predatory, and/or competitive selection effects. In other words, common descent suggests that natural selection channels randomly occurring variations in unrelated species toward identical outcomes.

There are two obvious problems with the common descent explanation for convergence. The first is the frequency with which convergence is observed to occur. For instance, a study by Shah, McCandlish, and Plotkin demonstrated, at the molecular level, that evolution is both unpredictable and irreversible. [56] The study focused exclusively on the type of evolution known as purifying selection, which favors mutations with no or only a small effect in a fixed environment. This is in contrast to adaptation, in which mutations are selected if they increase an organism’s fitness in a new environment. Purifying selection is by far the more common type of selection.

Consequently, we would expect functional ERVs and pseudogenes to be extremely rare under common descent models. The second problem is that convergence sometimes occurs where the environmental, predatory, and competitive selection effects are not at all similar.

In contrast, the common design model predicts that over 80% of taxonomical groups have functional ERVs and pseudogenes because the mechanism the designer uses in the form of HRT naturally produces this effect, unlike natural selection (See the competitive endogenous RNA (ceRNA) hypothesis). [57]

For further research, evolutionary biologists, paleontologists, ecologists, and molecular biologists will need to look for morpho-molecular dissimilarities and/or a lack of fossil intermediates among order- and family-level taxa. Then, they will need to use the two-step ecology criteria described above to confidently conclude common design. Moreover, phylogeneticists will need to use Bayesian inference to determine whether the common design model fits the data better than common descent when applied to the list of suspected created kinds and basic types.
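In Bayesian model comparison, the relevant quantity is the Bayes factor, the ratio of the two models’ marginal likelihoods. The sketch below assumes the log marginal likelihoods have already been estimated with standard phylogenetic tools (e.g., stepping-stone sampling in MrBayes or BEAST); the function names and thresholds are illustrative, with the verbal scale following Kass and Raftery’s widely used 2·ln(BF) categories:

```python
def log_bayes_factor(log_ml_design: float, log_ml_descent: float) -> float:
    """Difference of log marginal likelihoods; positive values favor
    the first model over the second."""
    return log_ml_design - log_ml_descent

def interpret(log_bf: float) -> str:
    """Rough verbal scale for 2*ln(BF), after Kass & Raftery (1995)."""
    two_ln_bf = 2.0 * log_bf
    if two_ln_bf > 10:
        return "very strong"
    if two_ln_bf > 6:
        return "strong"
    if two_ln_bf > 2:
        return "positive"
    return "weak"
```

On this scale, a difference of ten log units between the two estimated marginal likelihoods would count as very strong evidence for the better-fitting model, whereas a fraction of a log unit would be negligible.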

Finally, it has repeatedly been found that what initially seemed to be non-functional ERVs and pseudogenes produced by an unguided process turned out to be functional after all as understanding of the design increased [59]. Because greater understanding has revealed that these apparently vestigial elements retain function in numerous instances, further research may reveal how many other ERVs and pseudogenes are actually functional as well.

This type of research should answer relevant open questions and contribute to the theory, including the following:
Origin of life: Exactly how, where, and when did life on Earth originate? What were the metabolic pathways used by the earliest life forms?

Origins of viruses: Exactly how and when did different groups of viruses originate?

Last Universal Common Ancestor: What were the characteristics of the Last Universal Common Ancestor of Archaea, Bacteria, and Eukaryotes? Did Archaea and Eukaryotes evolve out of the domain Bacteria or from a clade basal to it? Do Archaea and Eukaryotes share a later or earlier common ancestor with Bacteria?

All that is for the future. I can only say that the possible involvement of a universal common designer in genetic information processing has provided a novel way of analyzing these processes.

Theoretical Difficulties

According to the universal common designer theory, God is omniscient and designed animals for the purpose of surviving, reproducing, and/or adapting. For this reason, God cannot design animals with features that hinder or reduce the probability of surviving, reproducing, and/or adapting.

However, there are currently some examples in nature that seem to conflict with the theory, such as the serum response factor, which has been implicated in heart failure in humans. In addition, humans cannot breathe and swallow at the same time because of our lowered voice box, which supposedly allows us to create a wider range of sounds.

In the future, we expect to find a sensible purpose for these alleged design flaws, showing that they do not reduce the probability of survival, reproduction, and adaptation but rather increase it.

For example, the bumps (tubercles) along the leading edge of the humpback whale’s fins were thought to hinder its hydrodynamic efficiency. However, Van Nierop et al. (2008) assembled a detailed model of the humpback whale fin and tested it in a wind tunnel. Their experiments demonstrated two things: first, the bumps are not as big a limiting factor in the whale’s straight-ahead swimming speed as was initially presumed, and second, more unexpectedly, the bumps provide a major payoff in maneuverability. They allow the whale to reach a much higher angle of attack without losing essential momentum. [57]

Furthermore, God is omnibenevolent and designed animals for the purpose of surviving, reproducing, and/or adapting. For this reason, God will not design animals with pathogens or features that reduce the population or another animal’s ability to survive, reproduce, and fit an environmental niche.

However, there are current examples in nature that seem to conflict with the theory, such as

(i) Toxoplasma gondii and toxoplasmosis
(ii) Excretory or digestive systems of parasitic insects
(iii) Tongue-eating louse (a parasitic isopod)
(iv) Carnivorous behavioral genes of parasitic vertebrates
(v) The enzyme β-1,4-glucanase

In the future, we expect to find a sensible purpose for alleged “evil” designs that show they do not reduce (but increase) the population or another animal’s ability to survive, reproduce, and fit an environmental niche.

For example, in animals, injury can lead to long-lasting distress, whereby frequent exposure to pain-producing stimuli causes a progressively amplified response well after the injury has healed. This phenomenon has been referred to as “nociceptive sensitization.” Biomedical researchers have long viewed nociceptive sensitization as maladaptive because, in humans, it is associated with anxiety. [58]

However, Crook et al. (2014) studied nociceptive sensitizations in squids and concluded that heightened sensitivity to pain helps these creatures evade predation. Squids are an outstanding laboratory model because they undertake a precise sequence of defensive behaviors when threatened by a predator. When endangered, squids that fully recovered from a previous injury reacted sooner than those that had not been injured. Conversely, the previously injured squids exhibited a slower response to predatory threats when the scientists used anesthetic to block the pain immediately after injury, thus preventing nociceptive sensitization. As nociceptive sensitization is pervasive, it likely serves a similar benefit to other animals. Thus, these results indicate that pain (or suffering) plays a key role in enhancing the survival of animals following an injury and recovery.

Examples of design decay or design trade-offs in nature do not conflict with the theory. For instance, God cannot create a universe without decaying effects because the second law of thermodynamics is a feature of quantum mechanics; thus, it must exist in all possible worlds, unlike other laws of nature. [59] Design trade-offs we find in nature are also subject to the law of entropy. For this reason, God would not be held responsible for a genuine design flaw or cruel design feature under these circumstances.

In the future, it will be paramount for scientists to reexamine the remaining claims of design flaws by looking at the organism as a whole, even if it exhibits some perplexing features, rather than making an argument from ignorance or personal incredulity. By encouraging researchers to consider the overall design and purpose of an organism and to expand testing to different taxonomical groups, the aim is to accelerate scientific discovery and provide further support for this explanatory model.


According to the laws of logic, [60] the attributes of God have to work in accordance with each other in a logically consistent manner because God is who God is (i.e., the law of identity) and cannot simultaneously not be who God is (i.e., the law of non-contradiction).

This means that God cannot make God cease to exist because this would conflict with God being a necessary being. God cannot make a square circle because this would conflict with God’s omniscience. God cannot lie because it would conflict with God’s omnibenevolence. God cannot make a rock so heavy that God cannot lift it because it would conflict with God’s omnipotence.

Most importantly, God cannot create and develop a world that does not have God intimately involved in the process every step of the way because it would conflict with God’s “personal” nature. Thus, God must be true to “all” God’s attributes because to do otherwise would be to deny God’s own self.


[1] Darwin’s greatest discovery: Design without designer | PNAS

[2] Does evolutionary theory need a rethink? | Nature

[3] Transfer of noncoding DNA drives regulatory rewiring in bacteria | PNAS

[4] Evidence of non-random mutation rates suggests an evolutionary risk management strategy | Nature

[5] Hameroff, S., 2017. The quantum origin of life: How the brain evolved to feel good. In On Human Nature (pp. 333-353). Academic Press.


[6] How the EES differs from the Modern Synthesis – Extended Evolutionary Synthesis

[7] Owen, R., 1849. On the nature of limbs: a discourse delivered on Friday, February 9, at an Evening Meeting of the Royal Institution of Great Britain. J. van Voorst.

[8] Penrose, R., 1989. The Emperor’s New Mind: Concerning Computers, Minds, and the Laws of Physics. Oxford University Press, Oxford.

[9] [2208.03726] Human Perception as a Phenomenon of Quantization

[10] Nuclear Spin Attenuates the Anesthetic Potency of Xenon Isotopes in Mice | Anesthesiology | American Society of Anesthesiologists

[11] Live visualizations of single isolated tubulin protein self-assembly via tunneling current: effect of electromagnetic pumping during spontaneous growth of microtubule | Scientific Reports

[12] Consciousness in the universe: A review of the ‘Orch OR’ theory - ScienceDirect

[13] Krauss, L.M., 1998. The end of the age problem, and the case for a cosmological constant revisited. The Astrophysical Journal, 501(2), p. 461.


[15] Thompson, R.I., Bechtold, J., Black, J.H., Eisenstein, D., Fan, X., Kennicutt, R.C., Martins, C., Prochaska, J.X. and Shirley, Y.L., 2009. An observational determination of the proton to electron mass ratio in the early universe. The Astrophysical Journal, 703(2), p. 1648.

[16] Natural engineering principles of electron tunnelling in biological oxidation–reduction | Nature

[17] Quantum Uncertainty and Relativity Especially Fine-Tuned for You! - Reasons to Believe

[18] Experimental proof of nonlocal wavefunction collapse for a single particle using homodyne measurements | Nature Communications

Quantum Experiment Verifies Nonlocal Wavefunction Collapse for a Single Particle

[19] Measurements on the reality of the wavefunction | Nature Physics

Wave function gets real in quantum experiment | New Scientist

Ghosts in the atom: Unmasking the quantum phantom | New Scientist

[20] Phys. Rev. Lett. 126, 230403 (2021) - Relativistic Bell Test within Quantum Reference Frames

Quantum-nonlocality at all speeds – ScienceDaily

[21] Quantum erasure with causally disconnected choice | PNAS

[22] [quant-ph/0002037] Quantum Algorithms and the Genetic Code

[23] [1908.11213] The Grover search as a naturally occurring phenomenon

[24] Evidence of the accelerated expansion of the Universe from weak lensing tomography with COSMOS | Astronomy & Astrophysics (A&A)

Astronomers confirm Einstein’s theory of relativity and accelerating cosmic expansion – ScienceDaily

[25] Expanded encyclopaedias of DNA elements in the human and mouse genomes | Nature

[26] The extent of functionality in the human genome - PMC

[27] De novo mutation rates at the single-mutation resolution in a human HBB gene-region associated with adaptation and genetic disease

[28] Mutation bias reflects natural selection in Arabidopsis thaliana | Nature

[29] Evolution: are the monkeys’ typewriters rigged? | Royal Society Open Science

[30] DNA as a logic operator | Nature

[31] Comparing genomes to computer operating systems in terms of the topology and evolution of their regulatory control networks | PNAS

[32] Two genetic codes: Repetitive syntax for active non-coding RNAs; non-repetitive syntax for the DNA archives - PMC

[33] Grammar of protein domain architectures | PNAS

[34] Why Machine-Information Metaphors are Bad for Science and Science Education | SpringerLink

[35] Leonard M. Adleman, 1998. Computing with DNA. Scientific American, pp. 54–61.

[36] Are viruses our oldest ancestors? | EMBO reports

[37] Cello, J., Paul, A.V. and Wimmer, E., 2002. Chemical synthesis of poliovirus cDNA: generation of infectious virus in the absence of natural template. Science, 297(5583), pp. 1016-1018.

[38] Tenoever, B.R., 2013. RNA viruses and the host microRNA machinery. Nature Reviews Microbiology, 11(3), pp. 169-180.

[39] Degradation and remobilization of endogenous retroviruses by recombination during the earliest stages of a germ-line invasion | PNAS

[40] Movements of Ancient Human Endogenous Retroviruses Detected in SOX2-Expressing Cells | Journal of Virology

[41] The algorithmic origins of life | Journal of The Royal Society Interface

[42] Ten reasons to exclude viruses from the tree of life | Nature Reviews Microbiology

Viruses and the tree of life

[43] Elder, T. W., 2017. Created Kinds, Baraminology, and the Creation Orchard: On the Origin of Kinds by Special Creation and the Preservation of Mankind by the Creator. Scripture Advocate Publishing


[44] A List and Bibliography of Identified Baramins | Journal of Creation Theology and Science Series B: Life Sciences

[45] Potato spindle tuber “virus”: IV. A replicating, low molecular weight RNA - ScienceDirect

[46] Hot crenarchaeal viruses reveal deep evolutionary connections | Nature Reviews Microbiology

[47] The million-year wait for macroevolutionary bursts | PNAS


[49] Phylogenomic conflict coincides with rapid morphological innovation | PNAS

[50] Evidence of a false thumb in a fossil carnivore clarifies the evolution of pandas | PNAS

[51] Comparative genomics reveals convergent evolution between the bamboo-eating giant and red pandas | PNAS

[52] NextGen Stats Confirm that All Fossil Horses Belong to the Same Created Kind

[53] New Baraminological Methods Confirm Monobaraminic Status of the Horses (Perissodactyla: Equidae) and Preliminary Analyses of New Datasets Suggest the Possibility of Discontinuity between Horses and Various Outgroup Taxa



[56] Shah, P., McCandlish, D.M. and Plotkin, J.B., 2015. Contingency and entrenchment in protein evolution under purifying selection. Proceedings of the National Academy of Sciences, 112(25), pp. E3226-E3235.


[58] Chiu, H.S., Martínez, M.R., Bansal, M., Subramanian, A., Golub, T.R., Yang, X., Sumazin, P. and Califano, A., 2017. High-throughput validation of ceRNA regulatory networks. BMC genomics, 18(1), pp. 1-11.


[59] Overcoming challenges and dogmas to understand the functions of pseudogenes | Nature Reviews Genetics

[60] Phys. Rev. Lett. 100, 054502 (2008) - How Bumps on Whale Flippers Delay Stall: An Aerodynamic Model

[61] Crook, R.J., Dickson, K., Hanlon, R.T. and Walters, E.T., 2014. Nociceptive sensitization reduces predation risk. Current Biology, 24(10), pp.1121-1125.


[62] Second law of thermodynamics is ingrained within quantum mechanics - ScienceDirect

Rejection reason #1


You wouldn’t reject it for having a results section with no results?


Good reason for rejection, but do you really think anyone will read that far before hitting “delete”?

This has been through 10 rounds of scientific editing??


I’d already rejected it before getting to that section.

Definitely. Albeit from some mixture of entertainment, fascination, disbelief and reluctance to work on something else.


Personally I’m not reading an introduction until I know there are results worth caring about.


Having methods worth caring about, data worth caring about, and hypotheses worth caring about would probably help, too.
