In light of conflicting new data currently being reported in multiple fields, a different perception of evolution is beginning to emerge, wherein the processes underlying organismal development are recognized as causes of evolution. New questions are being raised in light of this novel evidence. Should evolution be considered a truly random process or a directed one? Do all living organisms share a common ancestor or a common evolutionary mechanism?
Based on previous studies and the available literature, I demonstrate herein how quantum mind theory and the universal common design/archetype are well-supported theories. I combine these two common explanations for life into a single theory, the “universal common designer theory.” Moreover, I provide a model describing the nature of this designer in more detail, which can potentially predict biological phenomena. I then provide a method to test for the existence of a universal common designer in nature by showing that a universal common design defines evolution. Finally, I address several scientific objections revolving around the dysteleological argument that have been leveled against the theory, providing a sound methodology to further test this idea.
Keywords: common design, Objective Reduction
As Ayala clarified, evolution by natural selection explains natural designs without a designer. For instance, when scientists use the term “random” to describe mutations, they refer to the unintentional nature of the process; mutations do not “attempt” to supply what the organism “needs” in a given moment or place. Instead, environmental factors influence only the rate, not the course, of mutations. For example, contact with harmful chemicals may increase mutation rates but will not increase beneficial mutations that render an organism resistant to those chemicals. In this sense, mutations are considered random because no “conscious” intent is involved, suggesting that there is no personal agent selecting adaptive combinations in evolution. Ayala further explains how this description of mutations essentially forms the basis of the Modern Synthesis theory, proposed between 1936 and 1947, which reflected the consensus on how evolution proceeds. Nineteenth-century evolutionary ideas from Charles Darwin, Gregor Mendel, and others were expanded upon by researchers studying population genetics between 1918 and 1932, who laid the groundwork for the Modern Synthesis by showing that Mendelian genetics is consistent with natural selection and gradualism.
Nevertheless, conflicting new data are currently being reported in multiple fields, and thus a different perception of evolution is beginning to emerge, wherein the processes by which organisms develop are recognized as causes of evolution. For instance, analysis of the genomes of 46 sequenced isolates of Escherichia coli provided a statistically supported comparison of the phylogenetic tree topologies for regulatory regions, their regulated genes, and vertical inheritance. This comparison showed that the evolution of the regulatory regions of over half of the core genes (i.e., genes shared by all isolates) was incongruent with vertical inheritance.
Moreover, Martincorena et al. similarly proposed, based on their observations, that the mutation rate has been evolutionarily optimized to reduce the risk of deleterious mutations. However, current knowledge of the factors influencing the mutation rate does not explain these observations, suggesting that other mechanisms are likely involved.
New questions are being raised in light of this novel evidence. Should evolution be considered a truly random process or a directed one? Do all living organisms share a common ancestor or a common evolutionary mechanism?
Based on previous studies and available literature, herein, I aim to demonstrate how the quantum mind theory is well-supported and can potentially predict biological phenomena. Moreover, I aim to substantiate the existence of a universal common designer in nature by showing that a universal common design underlies evolution. Previous attempts to reintroduce an intelligent designer into science by intelligent design (ID) theorists have not gained traction for several reasons.
ID theorists argue that the very presence of complex specified information (CSI) in DNA automatically provides empirical support for the claim that an intelligent designer created and designed life, because uniform experience shows that only human designers produce CSI. ID theorists attempt to provide examples of irreducible complexity or specified information in nature to establish that an intelligent designer devised all living organisms. This involves demonstrating how removing one part of a complex design, such as an eye, would cause the entire system to cease functioning. However, human designers are finite and fallible beings who design things from limited prior knowledge by using and modifying preexisting material, which would merely mimic Neo-Darwinian mechanisms. More importantly, ID theorists have not yet attempted to establish or explore the nature of this designer.
Unlike most ID theorists, Roger Penrose and Stuart Hameroff have explored the nature of this intelligent designer through a gravity-induced self-collapse, which they refer to as an “objective reduction.” They provided a comprehensive model of the nature and mechanism of this conscious agent that can explain the origin and evolution of life, species, and consciousness. However, their theory does not go on to differentiate this conscious agent from mindless forces. For instance, even if the proposed experiments confirmed Penrose’s prediction, all it would prove is that non-biological settings display elements that mirror conscious behavior; this could be considered anthropomorphism.
The Extended Modern Synthesis theory, however, holds two key assumptions: (1) mutations are a random process, and (2) all living organisms share a common ancestor. This article considers both assumptions while primarily focusing on the latter. Here, I show how the universal common archetype/design theory first proposed by Richard Owen, a predecessor of Darwin, can explain and predict how biological processes on Earth developed over time. I then propose an updated version of the Modern Synthesis theory. Finally, I provide a model describing the nature of this designer in more detail, creating a synthesis between the quantum mind theory and the universal common design/archetype theory.
Quantum mind theory
According to Roger Penrose, the action of consciousness proceeds in a way that cannot be described by algorithmic processes.  For instance, conscious contemplation can ascertain the truth of a statement and freely make intellectual and moral judgments. This involves distinguishing between true and false statements or what is morally “right” versus “wrong.”
The only thing in nature that does this is a wave-function collapse. For instance, at small scales, quantum particles simultaneously exist in the superposition of multiple states or locations, described by a quantum wave function. However, these superpositions are not seen in our everyday world because efforts to measure or observe them seemingly result in their collapse to definite states.  Why quantum superpositions are not seen is a mystery known as the measurement problem, which seems somewhat related to consciousness. Experiments from the early 20th century indicated that conscious observation caused superposition wave functions to collapse to definite states, choosing a particular reality. Consciousness was said to collapse the wave function under this view. 
Moreover, Diederik Aerts demonstrated how these two phenomena are identical by applying quantum theory to model cognitive processes, such as information processing by the human brain, language, decision-making, human memory, concepts and conceptual reasoning, human judgment, and perception. Owing to its increasing empirical success, quantum cognition theory has been taken to imply that we have quantum minds.
Other empirical data have suggested that the brain is a quantum computer that uses quantum mechanical processes, such as quantum tunneling and superposition, [10, 11] explicitly suggesting that we have quantum minds, as the Orch-OR theory predicted (see section 4.5, “OR and Orch-OR,” of “Consciousness in the universe” by Hameroff and Penrose for more details).
Lastly, observations and experiments on the fine-tuning constants seem to support an aspect of quantum mind theory called the universal proto-consciousness field theory. This field theory has also been referred to by Penrose as objective reduction (OR) and incorporated into his Orch-OR model to explain why humans have consciousness and why these fine-tuning constants exist.
For the sake of clarity, quantum mind theory does not advocate for dualism or an additional supernatural force/substance that would operate outside the rules of science. Instead, it advocates for consciousness as an essential ingredient of physical laws that science has not yet fully understood. For more details, please refer to the introduction of “Consciousness in the universe” by Hameroff and Penrose.
In the next section, I provide a model describing the nature of this universal proto-consciousness in more detail using fine-tuning constants related to life and advanced life. However, Penrose’s highly speculative gravity-induced collapse model will not be discussed or further developed because it is beyond the scope of this article.
Definition: universal proto-consciousness, the universal self-collapsing wave function.
Universal proto-consciousness field
According to eternal inflationary theory, the acceleration of the expanding universe is thought to result from quantum fluctuations of particles, collectively called the “cosmological constant,” that permeate the entire multiverse, in which, for every billion negative particles, a billion and one positive particles come into existence at once.
The cosmological constant is fine-tuned to a precision of roughly one part in 10^120. When scientists extrapolate the rate of expansion back to one second after the Planck era of the early universe, the required precision is revealed to be an astounding one part in 10^(10^123).
Furthermore, using data from the Planck satellite and the Wilkinson Microwave Anisotropy Probe (WMAP), researchers have demonstrated that the fine-structure constant of physics has remained fixed over the universe’s history. For the first time, a team of five physicists led by Nathan Leefer has confirmed the constancy of the fine-structure constant across the entire observable extent of the universe.
This places tighter constraints on models of dark energy that invoke rolling scalar fields and curtails the parameter space of supersymmetric and string-theory models.
Although the Planck satellite results do not yet rule out every model of dark energy that varies over time, there is no evidence that the cosmological constant has varied; the evidence now suggests it has probably remained constant throughout time and space.
Hypothetically, this indicates that if our universe’s expansion rate had taken different values with a larger amount of dark energy, the expansion would most likely have blown apart the cosmic material before the planets and stars where life of any kind might evolve could form. With a smaller amount of dark energy, the universe would most likely have collapsed back into a singularity before it ever reached its present size.
This sort of fine-tuning plays a role in biology as well. For instance, quantum tunneling needs to be extremely precise for hemoglobin to transport the right amount of oxygen to the cells of all vertebrate and most invertebrate species. 
For instance, if the conscious observer chooses to measure the momentum of a particle with precision, the observer discovers that the particle’s position is now known only to within about half a mile (for more details, please refer to inference #5 below). According to the Heisenberg principle, the more precisely the position of a particle is determined, the less precisely its momentum can be predicted from initial conditions, and vice versa. If the uncertainty in the position were to become much greater or smaller than half a mile, hemoglobin would not function as it does, rendering advanced life impossible.
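The tradeoff just described is quantified by the standard uncertainty relation, where Δx is the position uncertainty, Δp the momentum uncertainty, and ħ the reduced Planck constant:

```latex
\Delta x \, \Delta p \;\ge\; \frac{\hbar}{2}
```

Fixing the momentum uncertainty Δp thus sets a floor on the position uncertainty Δx; the claim above is that this floor happens to fall in the range hemoglobin requires.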
This means that, although the outcomes allowed by the Heisenberg principle are random, the magnitude of the uncertainty itself must be fine-tuned. Overall, combining all the evidence reveals that the universal proto-consciousness has the following attributes:
- Is causally active everywhere (i.e., omnipresent).
There are two types of causal interaction in nature: antecedent causation and simultaneous causation. In antecedent causation, the cause precedes its effect in time, because material causes operate under the principles of classical physics.
In contrast, in simultaneous causation the cause exists together with its effect inside the same event, because non-local causes operate under the principles of quantum physics. We have experimental proof that the “choice of measurement in one lab really causes a change in the local quantum state in the other lab.”
- Knows the position and momentum of every particle in the universe at a particular moment in time according to the universal wave function (i.e., omniscience).
The universal wave function is the “basic physical entity” representing the “totality of existence,” or “the fundamental entity, obeying at all times a deterministic wave equation.” Each wave function is a mathematical configuration of matter or of the universe and has been confirmed to exist as an objective part of reality.
Furthermore, according to the superposition principle in quantum mechanics, wave functions can be added together and multiplied by complex numbers to form new wave functions. The Schrödinger equation determines how wave functions evolve over time, and a wave function behaves qualitatively like other waves, such as water waves or waves on a string, because the Schrödinger equation is mathematically a type of wave equation, explaining the name “wave function.”
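The time evolution just mentioned is governed by the Schrödinger equation, where Ψ is the wave function, ħ the reduced Planck constant, and Ĥ the Hamiltonian (energy) operator:

```latex
i\hbar \, \frac{\partial}{\partial t} \Psi(\mathbf{r}, t) = \hat{H} \, \Psi(\mathbf{r}, t)
```

Because this is a linear wave equation, any sum of solutions is itself a solution, which is precisely the superposition principle described above.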
- Has the power to accelerate our universe forever (i.e., omnipotent)
If the topology of the universe is open or flat, or if dark energy is a positive cosmological constant (both consistent with current data), the universe will continue expanding forever.
Moreover, the scientific definition of “Power” is the rate at which energy is transferred, used, or transformed; it is not synonymous with energy. Therefore, the cosmological constant or vacuum energy is not an infinite amount of energy; instead, it is a very precise cancellation effect between negative and positive energy particles or fluctuations.
- Does not have defined locations in space and time (i.e., eternal).
According to the classical definition of space-time, things that do not have defined locations in space and time do not exist in classical space and time. Thus, as classical space and time have been shown to be interwoven concepts and matter/energy cannot exist without space-time, something considered tenseless would have to be understood as “non-local,” that is, outside the classical space-time of general relativity.
This is referred to as quantum non-locality, where each particle moves in a so-called superposition of different velocities simultaneously. This has been confirmed to be a universal property of the world, regardless of how and at what speed quantum particles move. 
- Intended for advanced life to survive, reproduce, and pioneer different environments (i.e., personal)
Quantum physics experiments have revealed no concrete physical reality made of classical space-time constituents. Instead, the so-called material realm actually exists in a super-positional state of all quantum possibilities (i.e., wave functions) that are mathematical in nature.
The intangible phenomenon of conscious observership is the only means capable of “collapsing” any given combination of quantum wave functions, which imparts a concrete and physical reality to them: “No naive realistic picture is compatible with our results because whether a quantum could be seen as showing particle-like or wave-like behavior would depend on a causally disconnected choice.” 
This shows how the right fine-tuning values were carefully chosen to allow advanced life to exist, from the beginning up to the present.
- Intends all possible kinds of life to potentially survive, reproduce, and pioneer different universes (i.e., omnibenevolent)
Every living creature on Earth uses the same code, in which DNA stores information using four nucleotide bases. The sequences of nucleotides encode information for constructing proteins from an alphabet of 20 amino acids. Why were these numbers chosen rather than some other numbers?
Patel showed how quantum search algorithms explain why these numbers were chosen.  To summarize, if the search processes involved in assembling DNA and proteins are to be as efficient as possible, the number of bases should be four, and the number of amino acids should be 20.
In other words, to address a common set of problems faced by organisms possessing different characteristics and living in different habitats, a single optimal solution must be employed. This means that if we replay the evolutionary history of life, it would lead to identical or nearly identical outcomes. An experiment has revealed that this quantum search algorithm is itself a fundamental property of nature. 
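Patel’s proposal can be sketched numerically. In Grover’s quantum search, q oracle queries reliably distinguish N items when (2q + 1)·θ = π/2 with sin θ = 1/√N. The function below evaluates that standard relation (the function name is mine, not Patel’s):

```python
import math

def grover_capacity(q: int) -> float:
    """Largest set size N that q Grover queries can reliably
    distinguish, from (2q + 1) * theta = pi / 2 with
    sin(theta) = 1 / sqrt(N)."""
    theta = math.pi / (2 * (2 * q + 1))
    return 1.0 / math.sin(theta) ** 2

print(grover_capacity(1))  # ~4, matching the four nucleotide bases
print(grover_capacity(3))  # ~20.2, matching the ~20 amino acids
```

One query thus singles out one of four possibilities, and three queries single out one of about twenty, which is the numerical coincidence the argument rests on.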
- Must exist in all possible worlds or universes to create and sustain life (i.e., necessary).
Observations show that dark energy is consistent with general relativity and affects the expansion of the universe. This suggests that the cosmological constant also applies to the smaller universes created within regions of our universe, because general relativity predicted the existence of gravitational waves, whose confirmation supports eternal inflationary theory.
Empirical support for model
Support for this model comes from confirmed predictions of non-random mutations. For instance, as Mattick and Dinger indicated, it has long been argued that the presence of non-protein-coding or so-called ‘junk’ DNA, which comprises more than 90% of the human genome, is evidence of the accumulation of evolutionary debris by blind Darwinian evolution; this would argue against intelligent design, as an intelligent designer would presumably not fill the human genetic instruction set with meaningless information.
This objection has been essentially refuted by the growing functional indices of noncoding regions of the genome, which have reciprocally been used to support the notion of ID and to challenge the conception that natural selection alone accounts for the existence of complex organisms. In fact, it has been shown that well over 80% of so-called junk DNA is functional. However, these results remain controversial; I direct readers to an article that addresses and responds to the objections.
The vast majority of mutations in protein-coding regions are deleterious, which prevents beneficial mutations from being fixed within the population. Nonetheless, in a study of 34 E. coli strains, Martincorena, Seshasayee, and Luscombe discovered that the mutation frequency varies across bacterial genomes. Some regional “hot spots” have a relatively high mutation rate, whereas “cold spots” display a relatively low rate of genetic change. The researchers found that the locations of the hot and cold spots are not random. Thus, it appears that mutation rates have been fine-tuned to lower the risk of harmful genetic changes; recent studies have drawn the same conclusion. [27, 28]
A similar study suggested that mutations are guided both by the physical properties of the genetic code and by the need to preserve critical protein function. The authors analyzed mitochondrial genomes in several species and found numerous positively selected sites where DNA changes allowed the species to adapt to their environments. They hypothesized that the cell might make mistakes copying repeated sequences during DNA replication; while the cell is correcting these errors, the DNA has more time to mutate.
These repeats influence the mutation rate, as mutations in the repeats on either side of the mutated DNA would abrogate protein function, preventing them from being eliminated and resulting in a mutational ‘hot spot’ between stable DNA sequences. Furthermore, the authors found that 97% of the sites were under positive selection, and 60% of all mutated sites were near repetitive sequences. 
Thus, all this evidence suggests that mutation rates have been fine-tuned to lower the risk of harmful genetic changes, corresponding directly with the model of a creator using the fine-tuning principles of electron tunneling in the construction of advanced life. In the next section, I highlight further observations that reveal more of the nature of this being.
Universal common designer theory
DNA contains two types of information: digital, represented by the genetic information, and analog, represented by the genome; both are present in DNA and have many properties identical to those of man-made computers and linguistic texts.
For example, in modern computers, logic gates convert input signals into an output. Typical gates include AND, OR, and YES. By using the output of one logic gate as the input to another, computer scientists can develop sophisticated networks of gates to compute answers to complex questions.
Researchers noticed that the initiation of DNA replication also operates as a logic gate at the molecular level. They were able to modify the biochemical systems involved in DNA replication to implement AND, OR, and YES logic gates that respond to chemical inputs provided in a laboratory setting. Another example is the genome of the bacterium E. coli, which other researchers revealed to function like the Linux computer operating system (OS).
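The gate-composition idea above can be sketched in a few lines. The function and signal names here are illustrative inventions, not taken from the cited studies; the point is only that chaining simple gates yields a conditional trigger:

```python
def YES(a: bool) -> bool:
    """Buffer gate: output equals input."""
    return a

def AND(a: bool, b: bool) -> bool:
    return a and b

def OR(a: bool, b: bool) -> bool:
    return a or b

# Hypothetical network: trigger replication only when both chemical
# inputs are present, or when an override signal is supplied.
def replication_trigger(chem1: bool, chem2: bool, override: bool) -> bool:
    return OR(AND(chem1, chem2), YES(override))

print(replication_trigger(True, True, False))   # True
print(replication_trigger(True, False, False))  # False
```

Feeding one gate’s output into another is exactly the composition step the researchers exploited biochemically.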
Moreover, the information content of proteins bears almost the same mathematical structure as human language. This involves discrete mathematics or statements of logic and language that humans use to communicate with each other every day, such as phrases, signs, and symbols that are meaningful and personal. [32, 33]
Of course, some scientists insist that comparisons between DNA information and human information are merely metaphorical. This is contradicted by biotechnologists, who have shown that the protein machines that operate on DNA during processes such as transcription, replication, and repair operate very similarly to a computer system. In fact, the similarity is so remarkable that this insight has spawned a new area of nanotechnology called DNA computing.
Considering these additional data, this universal proto-consciousness probably operated in a similar manner to humans when designing life on Earth. This means that we would not have to worry about using an unfalsifiable theory that involves an omnipotent human because a non-computable being cannot violate his own nature (see Appendix on why God cannot violate his own nature).
In other words, the non-computable trait that this designer possesses offsets the omnipotent trait that this designer would also have to possess. For this reason, we would expect God’s human-like nature to be consistent, without the flaws that humans naturally have because of their inherent physical limitations.
This allows us to treat an omnipotent God in the same way we would treat other intelligent agents (Neanderthals, modern humans, extraterrestrial intelligence, etc.) when we want to use a valid cause to explain a biological phenomenon over a mindless force.
Thus, all candidates are considered natural but immaterial causes that we can test, because consciousness is held to be fundamental physics, not classical physics.
If this theory is true, we would expect all extant organisms to have a common design that can be traced back to this universal common designer.
To be continued…