Consensus should determine what's taught in science classes. Why?

All that shows is that these kinases perform a useful role in single-celled organisms. That isn’t front-loading. It’s the cooption of functional proteins for new purposes.

None of that is responsive. I didn’t ask about species, so the whole first paragraph is pointless. And are you really claiming that Caucasians and Asians are separate species?

And of course “all of mankind” can’t be a basic type under the definition you have provided. Humans have changed over time, do have primitive ancestors, and are related by common ancestry to other apes, other monkeys, other primates, other mammals, and so on. This too is not a response to my question; though it does mention a particular basic type, it makes no attempt to show how you know it’s a basic type. Multiple fail.


If you can find The Design Matrix on Amazon and check the reviews, you should see mine.


For example, " some nonvascular plants could have seeds or flowers, like vascular plants, but they do not. Gymnosperms (e.g. conifers or pines) occasionally could be found with flowers, but they never are. Non-seed plants, like ferns, could be found with woody stems; however, only some angiosperms have woody stems. Conceivably, some birds could have mammary glands or hair; some mammals could have feathers (they are an excellent means of insulation).

Certain fish or amphibians could have differentiated or cusped teeth, but these are only characteristics of mammals. Certain fish or amphibians could have differentiated or cusped teeth, but these are only characteristics of mammals. A mix and match of characters like this would make it extremely difficult to objectively organize species into nested hierarchies. " - by Joe G

Again, it depends. How many convergent examples would be sufficient to count as validation of the common design hypothesis rather than common descent?

Without providing a number on your end, I don’t see the point of researching the number of cases on my end.

Bayesian inference and analysis.

This is not true. We do observe this mechanism:

Major viral impact on the functioning of benthic deep-sea ecosystems | Nature

Viruses manipulate the marine environment | Nature

Mike Gene’s claim is that, consistent with observations, the initial designed state of unicellular organisms contained all the genes needed to develop multicellular organisms. Given the genetic code itself was designed to minimize deleterious mutations, it seems likely that it was also designed to exploit the evolutionary potential of C-T transitions.

So the bias toward C-T transitions would just be used to exploit this front-loaded state and “unpack” buried designs. As a result, certain evolutionary trajectories were made more likely for the formation of basic types.

This is almost exactly what the study suggested.

Yes, I get that, but habitat and environment are not synonymous terms, so the question conflates the two.

Habitat is the natural home of a plant, animal, or other organism.
The environment is the state in which organic, inorganic, and cultural elements interact to support the birth, growth, existence, etc., of an organism.

As I explained before, the ENCODE project revealed that most, if not all, of these non-coding regions play an important role in the accurate functioning of the DNA. Ever since the ENCODE results, there have been many more examples such as these, revealing specific functions for non-coding DNA:

Thus, your objection still does not hold any water.

Another study I gave you would beg to differ:

" The scientists found that the way DNA was wrapped around different types of proteins was a good predictor of whether a gene would mutate or not. “It means we can predict which genes are more likely to mutate than others and it gives us a good idea of what’s going on,” Weigel said.

The findings add a surprising twist to Charles Darwin’s theory of evolution by natural selection because it reveals that the plant has evolved to protect its genes from mutation to ensure survival."

Study challenges evolutionary theory that DNA mutations are random: Findings could lead to advances in plant breeding, human genetics – ScienceDaily

I am not following you here. Why do they need to do this in order to prove mutations are non-random?

It is not part of the definition of random.

Co-option still involves front-loading according to Mike:

“Cooption, the process by which traits switch function, is something we predict to be important from the hypothesis of front-loading evolution. The Design Matrix lays out a step-by-step case for the logic of front-loading that leads to the realization that cooption is entailed by front-loading. Functional shifts are the very strategy that would work in an attempt to design the future through the present.”

…Yet there is a simpler way to help people understand that cooption is, at the very least, a process that fits very comfortably within a teleological framework. It is the simple fact that cooption is tightly linked to preadaptation. Stephen Jay Gould sought to replace the word ‘preadaptation’ with the word ‘exaptation,’ where an exaptation is a character that retains its ancestral form while taking on a new function. And the process by which the trait switches function is called cooption.

The concept of preadaptation has been recognized by many to possess distinct teleological connotations, which is why non-teleologists have sought to replace it."

"…To see why it is that front-loading predicts cooption, simply consider front-loading as a state where the original cells were endowed with a set of preadaptations that would channel and guide subsequent evolution.

The hypothesis of front-loading evolution would thus predict that significant transitions in evolution would depend on preadaptation."

Consider the abstract for this paper. Emphasis is added:

Evolution. 2010 May;64(5):1189-201.
The importance of preadapted genomes in the origin of the animal bodyplans and the Cambrian explosion.
Marshall CR, Valentine JW.

The genomes of taxa whose stem lineages branched early in metazoan history, and of allied protistan groups, provide a tantalizing outline of the morphological and genomic changes that accompanied the origin and early diversifications of animals. Genome comparisons show that the early clades increasingly contain genes that mediate development of complex features only seen in later metazoan branches. Peak additions of protein-coding regulatory genes occurred deep in the metazoan tree, evidently within stem groups of metazoans and eumetazoans. However, the bodyplans of these early-branching clades are relatively simple. The existence of major elements of the bilaterian developmental toolkit in these simpler organisms implies that these components evolved for functions other than the production of complex morphology, preadapting the genome for the morphological differentiation that occurred higher in metazoan phylogeny. Stem lineages of the bilaterian phyla apparently required few additional genes beyond their diploblastic ancestors. As disparate bodyplans appeared and diversified during the Cambrian explosion, increasing complexity was accommodated largely through changes in cis-regulatory networks, accompanied by some additional gene novelties. Subsequently, protein-coding genic richness appears to have essentially plateaued. Some genomic evidence suggests that similar stages of genomic evolution may have accompanied the rise of land plants.

I will just allow creationists to address your question:

"Since most groups have been studied with only one analysis, most of these monobaramins and holobaramins should be considered tentative. In other cases, multiple studies using different methodologies seem to be converging on a single answer, as in the case of horses, which have been analyzed using statistical methods (Cavanaugh et al. 2003) and by hybridization (Stein-Cadenbach 1993).

In other instances, multiple statistical analyses of different taxon or character samples are converging on a consistent answer. For example, the human holobaramin has been evaluated multiple times with different samples of fossils and characters, and the consistent result supports recognizing genus Homo (+ Australopithecus sediba) as the human holobaramin (Wood 2010, 2016a; O’Micks 2016).

Especially interesting is the case of Australopithecus sediba, which was placed in the human holobaramin based on baraminological analysis by Wood (2010), and recent phylogenetic analysis by Dembo et al. (2015) confirms that classification. With regard to the majority of baraminology studies, though, the humans and horses are an exception. Most do not have confirmation from multiple studies."

No, what he says there is that front-loading involves cooption. The problem with using that as evidence of front-loading is that non-front-loading also involves cooption. A prediction that doesn’t differ between hypotheses is not useful in telling the hypotheses apart.

Note, by the way, that this is all about common descent. Front-loading is a hypothesis that begins with common descent. But you deny common descent. Puzzling.

But the quote that follows doesn’t address my question. As usual. I have to question whether you understand what questions and answers are.


Another ‘quote’ from @Meerkat_SK5, and as usual there are issues with it.

For starters, anyone using Joe G as an authority needs their head examined. But that’s not really relevant in this case, because Joe G didn’t write that - Doug Theobald did, and Joe G copied it. Unlike @Meerkat_SK5, Joe G did in fact acknowledge that Theobald was the author, not himself, so @Meerkat_SK5 has no excuse for not doing so.

@Meerkat_SK5 has also been messing with the text. It’s particularly obvious in this case because @Meerkat_SK5 has duplicated the first sentence, while Joe G did not.

So we’re left once again with just a mined, mangled, miscited ‘quote’. There’s nothing to indicate that @Meerkat_SK5 has even read the article he’s ‘quoting’, let alone understood it, so no further response is necessary.


What do you mean? What is considered non-frontloading to you?

No, I deny universal common descent and random mutations, remember…
Non-random mutations would still produce limited common descent after the formation of basic types.
I also deny that natural selection played a role in producing basic types, because cytosine deamination is the primary mechanism.

NO, I did. You said, “Pick any basic type and then explain how you know it’s a basic type.”
As in the case of horses, which have been analyzed using statistical methods (Cavanaugh et al. 2003) and by hybridization (Stein-Cadenbach 1993), we can conclude they are a basic type.

By the way, I have been meaning to ask you… Which type of convergence would you say natural selection does not predict or explain?

Evolutionary biologists recognize five different types of biochemical convergence.

“1. Functional convergence describes the independent origin on more than one occasion of biochemical functionality.
2. Mechanistic convergence refers to the multiple independent emergences of biochemical processes that use the same chemical mechanisms.
3. Structural convergence results when two or more biomolecules adopt independently the same three-dimensional structure.
4. Sequence convergence occurs when either proteins or regions of DNA arise separately, yet have identical amino acid or nucleotide sequences, respectively.
5. Systemic convergence describes the independent emergence of identical biochemical systems.”

PII: 0968-0004(94)90167-8 (uu.nl)

Enzyme Convergence Taxes Evolutionary Paradigm - Reasons to Believe

Why would it be a problem if we couldn’t fit species into an objective nested hierarchy? Humans regularly mix and match genes from very divergent species when they genetically modify different organisms. Why couldn’t a different designer do the same?

Depends on what? What is the frequency of convergence? Is it extremely rare, or is it common?

We would only expect rare instances of convergent sequence with common descent.

Statistical tests do not describe the model. What model of intelligent design is Ewert using?

Viruses aren’t producing genomes that contain all of the genes found in all eukaryotes.

Based on what evidence?

Based on what evidence?

And there are fish species and octopus species that share the same habitats and environments. So why do they have different eyes?

The ENCODE project demonstrated no such thing. Just because a stretch of DNA does something does not mean it has function. The ENCODE project tried to conflate “does something” with “function”, and that simply isn’t correct as has been pointed out by many scientists to this day.

That doesn’t differ from what I said. It says right in the quote “more likely to mutate”. This means that it is a probability. It doesn’t guarantee that a specific base will mutate during a specific replication.

Random mutations do not stop being random mutations because fewer of them happen. If I buy 100 lottery tickets and you buy 1 lottery ticket it doesn’t make the lottery nonrandom.
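To make that concrete, here is a toy simulation (the site count and weights are invented for illustration): give one site a 100-fold mutation bias, and the hotspot dominates the totals, yet no single replication is predictable.

```python
import random

random.seed(42)

# Toy genome of 10 sites; site 0 is a "hotspot" with a made-up 100x bias
weights = [100] + [1] * 9

# Each replication mutates one site, drawn according to the biased weights
draws = [random.choices(range(10), weights=weights)[0] for _ in range(1000)]

print("fraction hitting the hotspot:", draws.count(0) / 1000)  # ~0.92
print("first ten replications:", draws[:10])  # still unpredictable one by one
```

The bias changes the odds, but each draw is still random: probability, not prediction.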

Mike asserts it involves front-loading with no evidence to back it up. Why wouldn’t we see beneficial interactions between genes and mutations without front loading?


That reminds me. I had an exchange of comments with Mike Gene about his book, The Design Matrix, initially at Telic Thoughts, then at a blog he set up to promote his book. I made the point that his book was very thin on evidence to support his idea. Gene assured me he was working on a sequel and would include the supporting evidence. I don’t think it has been published as yet.


Not surprising. It really is nothing more than the Sharpshooter fallacy. Using the same logic, I could claim the lottery is front loaded so that you will get a specific winner. To prove my point, I wait for someone to win and then claim it was front loaded. That’s essentially what Mike is doing.


Non-frontloading is anything that isn’t frontloading. Frontloading supposes that the ancestors of some taxa were created with genes that would be useful to those taxa in the future, with the explicit intention of having them become useful. (Note that this requires common descent of such taxa, which you deny.) Note that this requires prior intention, which preadaptation and exaptation and cooption do not.

You deny common descent of the species you use to describe front-loading, i.e. that single-celled eukaryotes share descent with multicellular eukaryotes. You even deny common descent of animals. Don’t be coy.

That’s pretty crazy, since families (which you generally identify with basic types) do not differ primarily by C-T transitions. Anyway, cytosine deamination can’t be involved in separate creation of basic types, only in their subsequent evolution.

That isn’t an explanation. It isn’t even a complete citation.

I’m not sure I understand what those different types even mean.


From Theobald:

" Unlike organisms, cars do have a mix and match of characters, and this is precisely why a nested hierarchy does not flow naturally from classification of cars.

In fact, it is possible to have a “reciprocal” pattern from nested hierarchies. Mathematically, a nested hierarchy is the result of specific correlations between certain characters of organisms. When evolutionary rates are fast, characters become randomly distributed with respect to one another, and the correlations are weakened. However, the characters can also be anti-correlated—it is possible for them to be correlated in the opposite direction from what produces nested hierarchies (Archie 1989; Faith and Cranston 1991; Hillis 1991; Hillis and Huelsenbeck 1992; Klassen et al. 1991). The observation of such an anti-correlated pattern would be a strong falsification of common descent, regardless of evolutionary rates."
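To see what mix-and-match does to nesting, here is a toy sketch with hypothetical character matrices (not data from any of the cited papers):

```python
# Two toy character matrices (rows = taxa, columns = traits; 1 = present).
# In the nested matrix traits accumulate along a branching order; in the
# mix-and-match matrix the same traits are scattered across taxa.
nested = [[1, 0, 0, 0],
          [1, 1, 0, 0],
          [1, 1, 1, 0],
          [1, 1, 1, 1]]
mixed  = [[1, 0, 1, 0],
          [0, 1, 0, 1],
          [1, 1, 0, 0],
          [0, 0, 1, 1]]

def nesting_score(matrix):
    """Count taxon pairs whose trait sets nest (one is a subset of the other)."""
    sets = [{j for j, v in enumerate(row) if v} for row in matrix]
    pairs = [(a, b) for i, a in enumerate(sets) for b in sets[i + 1:]]
    return sum(a <= b or b <= a for a, b in pairs), len(pairs)

print(nesting_score(nested))  # (6, 6): every pair of taxa nests cleanly
print(nesting_score(mixed))   # (0, 6): nothing nests
```

Traits gained along a branching order nest cleanly; the same traits scattered across taxa do not, which is the point of the car analogy.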

It is very frequent according to the common design model.

Which type of convergence would you say natural selection does not predict or explain?

Evolutionary biologists recognize five different types of biochemical convergence.

“1. Functional convergence describes the independent origin on more than one occasion of biochemical functionality.
2. Mechanistic convergence refers to the multiple independent emergences of biochemical processes that use the same chemical mechanisms.
3. Structural convergence results when two or more biomolecules adopt independently the same three-dimensional structure.
4. Sequence convergence occurs when either proteins or regions of DNA arise separately, yet have identical amino acid or nucleotide sequences, respectively.
5. Systemic convergence describes the independent emergence of identical biochemical systems.”

PII: 0968-0004(94)90167-8 (uu.nl)

Enzyme Convergence Taxes Evolutionary Paradigm - Reasons to Believe

You mean what theory of intelligent design he is using as a basis for his model? I don’t know. I am guessing it is the one that traditional ID theorists have promoted in public, which is…

“certain features of the universe and of living things are best explained by an intelligent cause, not an undirected process such as natural selection.”

That’s not the hypothesis. The hypothesis is that some viruses contained all the required genes to make certain evolutionary trajectories more likely, thus forming the basic types.

Through natural selection, these viruses evolved into different unicellular species in the deep-sea hydrothermal vents of the primitive earth, undergoing an extensive amount of HGT.

Here is the evidence for this:

Evolution. 2010 May;64(5):1189-201.
The importance of preadapted genomes in the origin of the animal bodyplans and the Cambrian explosion.
Marshall CR, Valentine JW.

The genomes of taxa whose stem lineages branched early in metazoan history, and of allied protistan groups, provide a tantalizing outline of the morphological and genomic changes that accompanied the origin and early diversifications of animals. Genome comparisons show that the early clades increasingly contain genes that mediate development of complex features only seen in later metazoan branches. Peak additions of protein-coding regulatory genes occurred deep in the metazoan tree, evidently within stem groups of metazoans and eumetazoans. However, the bodyplans of these early-branching clades are relatively simple. The existence of major elements of the bilaterian developmental toolkit in these simpler organisms implies that these components evolved for functions other than the production of complex morphology, preadapting the genome for the morphological differentiation that occurred higher in metazoan phylogeny.

"Mobile genetic elements, including transposons, plasmids, bacteriophage and self-splicing molecular parasites, have played a crucial role in facilitating the movement of genetic material between organisms [7,8]. These elements likely already played a similar role in the early stages of life’s evolution [9], and continue to play a role even in multicellular eukaryotes.

… Even a universal code must have progressed through evolutionary stages of increasing complexity. The presence of HGT early in the evolution of life before the time of LUCA is also supported by the optimality of the genetic code itself, which likely depended upon extensive HGT to become established [30]."

Ancient horizontal gene transfer and the last common ancestors | BMC Ecology and Evolution | Full Text (biomedcentral.com)

"Our results immediately suggest solutions to three of the puzzles posed by the mosaic genome of birds and mammals ([Bernardi et al. 1985](javascript:;)). The first puzzle is why closely related genes on different chromosomes should often have dramatically different GC contents ([Li and Graur 1991](javascript:;)). The answer is that cytosine deamination and GC content form a positive feedback loop, such that an increase (or decrease) in GC content causes the mutation pressure to shift to a proportionately higher (or lower) GC bias (see [eqs. 2–8](javascript:;)).

All of the elements of this positive feedback loop are well established (C→T transitions affect the GC content, GC content affects DNA melting, DNA melting is rate-limiting for cytosine deamination, and cytosine deamination causes C→T transitions)."

Cytosine Deamination Plays a Primary Role in the Evolution of Mammalian Isochores | Molecular Biology and Evolution | Oxford Academic (oup.com)
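That feedback loop is easy to sketch numerically. In this toy model the rate constants are invented for illustration (the paper’s eqs. 2–8 give the real treatment); the only assumption carried over is that deamination pressure rises as GC content falls:

```python
def ct_pressure(gc, k=1.0):
    # Deamination needs single-stranded DNA, and AT-rich (low-GC) DNA melts
    # more readily, so C->T pressure rises as GC content falls.
    return k * (1.0 - gc)

gc = 0.50
for step in range(5):
    rate = ct_pressure(gc)
    gc -= 0.01 * rate * gc  # each C->T event converts a G:C pair to A:T
    print(f"step {step}: GC = {gc:.4f}, next C->T pressure = {ct_pressure(gc):.4f}")
# Each loss of GC raises the pressure for further loss: a positive feedback loop.
```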

Because fish species are generally considered prey, their eyes are going to be designed to avoid predators. In contrast, octopuses are generally predators, so their eyes would generally be designed to find and eat prey.

"The retinae of fish from different environments have specializations in the ganglion cell layer of the retina. Eyes of deep-sea fish often have a tubular shape that permits a large lens in a relatively small eye.

In the copepod crustaceans there are a number of examples of compound lenses that use multiple elements and aspheric surfaces, instead of a single inhomogeneous sphere."

4 Aquatic eyes: the evolution of the lens | Animal Eyes | Oxford Academic (oup.com)

Are you saying that transcription factor binding is not necessary for gene regulation?

If you are, then that simply isn’t correct as has been pointed out by many scientists to this day, such as Mattick and Dinger:

“Assertions that the observed transcription represents random noise…is more opinion than fact and difficult to reconcile with the exquisite precision of differential cell- and tissue-specific transcription in human cells.”

Here is what they were talking about:

“Most data acquisition in the project thus far has taken the biochemical approach, using evidence of cellular or enzymatic processes acting on a DNA segment to help predict different classes of functional elements. The recently completed phase of ENCODE applied a wide range of biochemical assays at a genome-wide scale to study multiple human cell types”

As suggested, they were able to find function based on that definition. Case closed!!!

Science does not deal in guarantees and absolute certainties, and this is a moving-the-goalposts fallacy. Ayala’s definition of random that YOU accepted is…

"(ii) there is no way of knowing which gene will mutate in a particular cell or in a particular individual.

However, the meaning of “random” that is most significant for understanding the evolutionary process is (iii) that mutations are unoriented with respect to adaptation; they occur independently of whether or not they are beneficial or harmful to the organisms."

The study I gave you and others disprove both assumptions.

You just addressed your own previous objection that frontloading does not add any useful value because the predictions are essentially the same. As you just suggested, they are not, because frontloading incorporates teleology, which can be used to make meaningful predictions, as Mike Gene has suggested:

"In 2001, Poole, Penny, and Sjoberg published a paper in Nat Rev Mol Cell Biol entitled, “Confounded cytosine! Tinkering and the evolution of DNA.” In the abstract of this paper, they make the following assertion: “Early in the history of DNA, thymine replaced uracil, thus solving a short-term problem for storing genetic information-mutation of cytosine to uracil through deamination. Any engineer would have replaced cytosine, but evolution is a tinkerer not an engineer.

Is it really true that “any engineer would have replaced cytosine”? Poole, Penny, and Sjoberg are effectively arguing that because of its propensity to mutate through deamination, there is no rational reason for using cytosine as a base and it exists only as a “frozen accident.” In other words, this aspect of cytosine is being used, at least in part, as an anti-design argument.

What I did is to draw from a teleological perspective to make a prediction. The prediction was simply this: Given its propensity to being damaged, there must be a reason cytosine was included as one of the four bases of DNA/RNA. This prediction is entailed by the teleological hypothesis that life is ultimately rational; if life was designed, then there is a reason behind its architecture and composition.*

So it’s not that front-loading predicts cytosine would be used as a base. It’s that a hypothesis of life’s design predicts there would be a reason cytosine was used as a base.

This prediction then provided the impetus to take a closer look at the relationship between the genetic code and cytosine deamination. And as a result, I uncovered a pattern that no one else (AFAIK) has seen – a reason to include cytosine as a base. A copy of my original description is posted here and I provide a more polished account in my book.

Let me summarize. I don’t think front-loading predicts cytosine would be incorporated into the DNA. I predicted that there would be a reason for including cytosine, and its propensity to deaminate, in response to Poole et al.’s assertion that “Any engineer would have replaced cytosine.” That assertion has been refuted. They asserted that no engineer would have used cytosine as part of the genetic material because of its predisposition for deamination. But it’s exactly this predisposition that might cause an engineer of evolution to include it.

*It is worth noting that non-teleologists, as a whole, agree with this logic. For example, when they (and I) cite useless junk DNA as an argument against design, they (and I) are drawing from this logic."

No, what I deny is that common descent is the only explanation for the data. Instead, it is shared mechanism that emanates from the logic of front-loading that was laid out in The Design Matrix:

Front-loading is the investment of a significant amount of information at the initial stage of evolution (the first life forms) whereby this information shapes and constrains subsequent evolution through its dissipation. This is not to say that every aspect of evolution is pre-programmed and determined. It merely means that life was built to evolve with tendencies as a consequence of carefully chosen initial states in combination with the way evolution works.

[…]

Front-loading, by definition, is about designing the future through the present. It is about imposing some kind of constraint on evolution, or more simply put, it is using evolution to carry out design objectives. Since evolution would proceed outward from the originally designed cells, evolution may have been endowed with various sequences and structures to increase the odds that certain future states would be found through a random search stemming outwards from this front-loaded state.

As Mike Gene explains further, " the idea is that this original state would both constrain and facilitate evolution and deep homology, at work behind the scenes in examples of convergence, represents evidence that my hypothesis of designing evolution is indeed feasible and a serious possibility. In fact, a paper published a few weeks ago in Trends in Genetics will sound very familiar to readers of this blog and the DM":

The multiple origins of a trait represent exceptional replicates of evolutionary processes and can provide extremely valuable insights into the constraints and opportunities that govern evolution. In particular, comparing the genetic determinants of the independent origins of an adaptive phenotype can shed new light on the role of genomic background in restricting or opening new evolutionary trajectories towards adaptive innovations. In this paper we discuss the potential causes of convergence at the genetic level together with their implications for our understanding of evolutionary biology in general.

In fact, the authors take this dynamic seriously enough to give it its own name:

studies have traced phenotypic convergence to modifications of homologous genes; in this paper such phenomena will be further referred to as convergent recruitment (Glossary).

And their glossary defines it as follows:

Convergent recruitment: the process of homologous gene becoming recurrently responsible for a novel function.

Yep, convergent recruitment is exactly the type of phenomenon front-loading predicts, rendering a plausible hypothesis even more strongly plausible."

Pascal-Antoine Christin, Daniel M. Weinreich and Guillaume Besnard. 2010. Causes and evolutionary significance of genetic convergence. Trends in Genetics 26 (2010) 400–405

I never suggested that cytosine deamination was a mechanism for how God separately created basic types, but a mechanism for creating them regardless of whether it was separate or not. With that said, other mechanisms are potentially at play in this process, as I suggested above.

Well, that is just ironic. How am I supposed to provide a competing model if you don’t even know the details of your model?

Better yet, why should I bother giving you an explanation for basic types on my end?

Bottom line: If you are going to claim that common descent is a better model, then you need to understand and explain your position in detail. I gave you the sources on convergence so go read them and come back to me with an explanation. Otherwise, common descent is not a useful model and thus should be discarded in favor of common design.

How does this explain why a lack of an objective nested hierarchy among separately created kinds would be a problem?

You need to show that convergence of sequence is common in reality, not in the model.

So what is the frequency of molecular convergence in reality?

Widespread convergence of sequence would not be predicted or explained by natural selection, especially in sequences like those in introns.

Diverging sequence can land on the same functions, so those types of convergences can be produced by natural selection. For example, bat and bird forelimbs have both found function as wings, but they quite obviously took different evolutionary pathways to get there, and their DNA is divergent.

According to Ewert, those models predict widespread violations of a nested hierarchy.

So the basic types share common ancestry in this model?

Can you predict how human genes are going to evolve using this model?

Where is the evidence that those preadaptations were put there by a designer?

How did we get such large variation and biodiversity if these genes were front loaded for a specific pathway?

That wouldn’t create a species with all of the genes we see currently in the human genome.

Explain this evidence in your own words.

Octopus are never prey, and fish species are never predators? Really?

What type of eye do you think that shark has?

I am saying that transcription factor binding does not always result in gene regulation. Even in random sequence there will be sequence motifs that bind transcription factors, and it has nothing to do with gene regulation. There is nothing stopping transcription factors from binding to non-functional DNA.

It’s easy to reconcile low level differential transcription of junk DNA in different tissues. Different tissues will express different transcription factors, and those different transcription factors will bind to different regions of junk DNA and drive the transcription of different regions of junk DNA.

There is every reason to expect transcription of junk DNA because there is not a strong enough selective pressure to drive either the precision of the transcription factor to only real gene promoters or the removal of transcription factor binding sites in junk DNA.
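The arithmetic here is worth spelling out. A typical transcription factor recognizes a motif only a few bases long, so purely random sequence is expected to be full of exact binding sites. A sketch, using a hypothetical 6 bp motif:

```python
import random

random.seed(1)
motif = "TGACGT"  # hypothetical 6 bp recognition site
genome = "".join(random.choice("ACGT") for _ in range(1_000_000))  # random "junk"

hits = sum(genome[i:i + 6] == motif for i in range(len(genome) - 5))
expected = (len(genome) - 5) * 0.25 ** 6  # ~244 exact matches per megabase

print(hits, round(expected))
```

Roughly 244 perfect matches per megabase arise by chance alone; scaled to a 3 Gb genome, that is hundreds of thousands of sites before any selection for function enters the picture.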

Their definition is “does something”. That’s not function.

No, it doesn’t. In the study you cite they can’t know which replication will produce a mutation in the regions they are pointing to. The mutations are also random with respect to fitness. The same process produces neutral and detrimental mutations.

Amid another batch of mangled quote mines, which I have ignored on the grounds that @Meerkat_SK5 is unlikely to have read what he is ‘quoting’ from, I spotted this:

Fish species are not generally considered to be prey. Most fish are predators and/or scavengers, though some herbivorous fish inhabit shallow waters and coral reefs. Even small fish are mostly predators, having diets of zooplankton.

Many fish are not just predators, but notorious predators. Sharks, for example. Anglers. Swordfish. Piranhas. Barracuda. Ask someone to name a marine predator, and they’re far more likely to mention a great white shark than an octopus. These fish’s eyes resemble the eyes of other fish, not those of octopuses. If octopus eyes are designed to find and eat prey, why aren’t shark eyes similarly designed? Why would shark eyes be designed to avoid predators?

All deep-sea fish are predator-scavengers, simply because the lack of food in the deep ocean leads to animals having to eat anything and everything that becomes available. There are no herbivorous deep-sea fish, because there are no plants there. The tubular eyes of the barreleyes and telescope fish referred to generally point upwards, towards prey, rather than downwards towards where plants would be if there were any. Their eyes are optimised for low light, not for spotting threats.

Most deep-sea fish have features obviously suitable for predation, noticeably their fearsome teeth. If fish like these were designed, why were they not given predatory eyes to go with their predatory teeth?

(Images: three deep-sea fish.)

P.S. There’s some lovely pictures as well as information on deep-sea diets here.


If someone found a Precambrian rabbit fossil, it has been said that this would falsify Common descent because it would conflict with the order of appearance.

The same reasoning can be applied to nested patterns. According to this source, "All a scientist has to do is find a life form that does not fit the hierarchical scheme in proper order. We can reasonably expect that yeasts will not secrete maple syrup. This model allows us the logical basis to predict that reptiles will not have mammary-like glands. Plants won’t grow eyes or other animal-like organs. Crocs won’t grow beaver-like teeth. Humans will not have gills or tails.

…We will not find any unicorns or “crockoducks.” There should never be found any genetic sequences in a starfish that would produce spider-like fangs. An event such as a whale developing shark-like fins would falsify common descent.

While these are all ludicrous examples in the sense that such phenomena would seemingly be impossible, the point is that any life form found with even the slightest cross-phylum, cross-family, cross-genus kind of body type would instantly falsify common descent. And, it doesn’t have to be a known physical characteristic I just listed. It could be a skeletal change in numbers of digits, ribs, or configurations. There is an infinite number of possibilities that if such a life form was unclassifiable, the theory of universal common descent would be falsified."

UNIVERSAL COMMON DESCENT | Intelligent Design (wordpress.com)

I just realized that sequence convergence is not necessarily something we would expect to find, because the model also predicts nested hierarchies, via HGT.

What is considered widespread convergence? Is it anything over 51%?

I definitely can see how natural selection would produce structural and functional convergence, but not so much for mechanistic convergence. Can you give me an example of how this could happen by natural selection on a molecular level? For instance, can you explain how Darwinian evolution used the same genetic formula to turn animals monogamous?

Evolution used same genetic formula to turn animals monogamous – ScienceDaily

No, they share the appearance of common ancestry because they share the same mechanisms, such as ERVs.

Potentially in the future, yes, according to the studies I provided.

To answer your second question, HGT from microbes:

“These parasitic entities have been implicated in altering structural, functional and epigenetic variability of their host genome [13], consequently enhancing the evolvability of species and lineages. The persistence of these molecular parasites in the genomes of their hosts may reveal an evolutionary arms race [14], and in some cases, molecular domestication has been reported [15,16].”
[emphasis added]

Confirmation for the model is based on the prediction entailed by the teleological hypothesis that life is ultimately rational; if life was designed, then there is a reason behind its architecture and composition. In this case, there must be teleological reasons God used these mechanisms for design, such as HGT and cytosine deamination. Those reasons can be tested, and if confirmed, they provide evidence for the theory.

As Fuz Rana and Mike Gene suggested, the common design model incorporates teleology into its framework. This can be used to make meaningful predictions that were not expected from Darwinian evolution.

For example, "the structural and functional features of the preexisting ERVs (i.e., their capacity to copy themselves and move throughout genomes) are precisely what make these ERV sequences so useful. Their capacity for retrotranspositioning affords these sequences the means to disrupt the endogenization process of invading retroviruses. In other words, for the ERV sequences to operate as antiretroviral elements, they must resemble endogenized retroviruses.

If the creation model perspective on ERVs is valid, then it suggests that ERVs may protect the host cell’s genome from retroviral infections through other mechanisms, like competitive inhibition. Most ERV sequences, like retroviral genomes, consist of two noncoding regions on the 3´ and 5´ ends of the sequence called long terminal repeats (LTRs).

The ERV sequences also contain genes for reverse transcriptase and the proteins located in the virus capsule. If the ERV sequence is transcribed to produce ERV RNA and if the capsid proteins are produced, then both the RNA and the capsid proteins could inhibit the assembly of invading retroviral particles, through competitive inhibition, which would prevent the transmission of the invading retrovirus to other cells. In this scenario, the similarity of the ERVs to retroviruses is crucial."

Koala Endogenous Retroviruses (ERVs) Protect against Retroviral Infections - Reasons to Believe

Thus, not only is there a function for ERVs, but a rationale for why God designed viruses the way they are and why there are similarities. This provides evidence that those preadaptations were put there by a designer.

This leads me to explain the teleological reason why Mike Gene’s mechanism was used, which will provide additional evidence that the designer put it there…

Since this is Mike Gene’s model, I think he will do a much better job at explaining it. Here is a summarized version of what he said:

“I don’t think front-loading predicts cytosine would be incorporated into the DNA. I predicted that there would be a reason for including cytosine, and its propensity to deaminate, in response to Poole et al.’s assertion that “Any engineer would have replaced cytosine.” That assertion has been refuted. They asserted that no engineer would have used cytosine as part of the genetic material because of its predisposition for deamination. But it’s exactly this predisposition that might cause an engineer of evolution to include it. A copy of my original description is posted here and I provide a more polished account in my book.”

Again, I never said it would.

I specifically said “generally”, and I thought we were comparing octopuses with “jawless” fish. How about you just tell me which octopus and fish families you want me to evaluate with the ecology criteria. We can find out with that method.

That can’t be true. For instance, a study showed that high affinity non-functional binding sites are rare:

“In this study, we analyze DNA-binding proteins in 947 bacterial or archaeal genomes and the genomes of 75 eukaryotic species…Our analysis demonstrates that weak binding sites in genomes are preferentially avoided, a result that holds true across the domains of life. Put another way, we show that the global word composition of each genome has been molded by its DNA-binding proteins over the course of evolution.”

Phys. Rev. X 6, 041009 (2016) - Genome-Wide Motif Statistics are Shaped by DNA Binding Proteins over Evolutionary Time Scales (aps.org)

As you can see, if most of the binding was random it would mess up the process of gene regulation because random interactions among genome components would potentially be very deleterious to the organism. Without minimizing these disruptive interactions, biochemical processes in the cell would most likely grind to a halt. This means that most of the binding that was measured by the ENCODE project was probably functional binding.

In fact, a study by Harvard scientists indicated that the concentration of PPI-participating proteins in the cell is also carefully designed. Protein structure and concentrations have to be precisely regulated to promote the PPIs critical for life…

As Fuz Rana suggested, “high-precision structures and interactions, exemplified by PPIs, are hallmark features of biochemical systems and, by analogy to fine-tuned human designs, point to the work of a Creator.”

Topology of protein interaction network shapes protein abundances and strengths of their functional and nonspecific interactions | PNAS

Again, the study I mentioned would beg to differ:

"We consider the possibility that natural selection against weak binding sites contributes to this process, and using an evolutionary model we show that the strength of selection needed to maintain global word compositions is on the order of point mutation rates.

Likewise, we show that evolutionary mechanisms based on interference of protein-DNA binding with replication and mutational repair processes could yield similar results and operate with similar rates. On the basis of these modeling and bioinformatic results, we conclude that genome-wide word compositions have been molded by DNA binding proteins acting through tiny evolutionary steps over time scales spanning millions of generations."

Phys. Rev. X 6, 041009 (2016) - Genome-Wide Motif Statistics are Shaped by DNA Binding Proteins over Evolutionary Time Scales (aps.org)

This is because it is not the causal role definition of function, which “ascribes function to sequences that play some observationally or experimentally determined role in genome structure and/or function.”

Since Fuz Rana is a biochemist, he is more than qualified to speak on such matters:

"The ENCODE Project focused on experimentally determining which sequences in the human genome displayed biochemical activity using assays that measured:

  • transcription,
  • binding of transcription factors to DNA,
  • histone binding to DNA,
  • DNA binding by modified histones,
  • DNA methylation, and
  • three-dimensional interactions between enhancer sequences and genes.

The implied assumption is that if a sequence is involved in any of these processes—all of which play well-established roles in gene regulation—then the sequences must have functional utility." [Emphasis added]

That all depends on what you mean by “know”. It seems as though you are defining it as knowing something with absolute certainty, which is not a standard of proof that science uses to produce truth:

“What do people mean when they say they “know” something in science? It usually means they did an investigation and expended considerable intellectual effort to build a useful explanatory model. It means they are confident about an explanation, believe others should trust what they say, and believe that their claim is testable. It means they can expect to be challenged and called to defend their position, and that their interpretation could eventually be proven “wrong” someday.”

ERIC - EJ997833 - What Does It Mean to Know?, Science and Children, 2012-Jul (ed.gov)

Based on this definition of “know”, the studies I provided did predict which replication will produce a mutation in the regions they are pointing to.

Again, why is this relevant? So what if the same process ends up producing the occasional neutral and detrimental mutations. This is actually expected to happen under my model. What’s the point here?

No, it wouldn’t conflict with the order of appearance. It would be the order of appearance.

There is just an assertion that separately created species should fall into a nested hierarchy, with no reasoning whatsoever. You have not put forward a single reason as to why a designer would create in such a way.

Humans designed goats that produce spider silk.

Why couldn’t the alleged designer do something similar?

Why does the model predict a nested hierarchy?

Why?

It’s the same pattern of gene expression which can be arrived at by many different DNA sequences. If a mutation is beneficial then it is kept. That is how natural selection works. Natural selection doesn’t care what the underlying sequence is. Natural selection only cares about the phenotype and how it changes the fitness of those who carry the phenotype.
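A toy illustration of that point (the genotype-to-phenotype map here is entirely invented): distinct sequences that produce the same phenotype receive identical fitness, so selection cannot tell them apart.

```python
import random

random.seed(0)

def phenotype(seq):
    # Invented map: the phenotype depends only on how many 'A's a sequence has,
    # so many different sequences collapse onto the same phenotype.
    return "monogamous" if seq.count("A") >= 3 else "non-monogamous"

def fitness(pheno):
    # Selection sees the phenotype only, never the underlying sequence
    return 1.2 if pheno == "monogamous" else 1.0

pop = ["".join(random.choice("ACGT") for _ in range(6)) for _ in range(8)]
for seq in pop:
    print(seq, phenotype(seq), fitness(phenotype(seq)))
# Distinct sequences with the same phenotype get identical fitness.
```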

Why would this produce a nested hierarchy?

Then show us. What will humans look like in 1 million years, and why?

Why does this lead to variety and biodiversity?

Your model predicts teleology so you conclude there must be teleology? That’s a circular argument.

Why does this require teleology? Why can’t it happen in the absence of teleology? How do you differentiate between teleology and non-teleology?

What is stopping Mike Gene from finding function in non-teleological systems and falsely concluding there is teleology?

Try all of them. All cephalopods (squid, octopus) have a forward facing retina. All vertebrates have an inverted retina. How do you explain this?

Nowhere in that paper does it claim that all non-functional transcriptional binding sites will be removed from the genomes of complex eukaryotes.

That’s a bad assumption.

If you can’t predict which replication will produce a specific mutation then mutation is random.

Because that is part of the definition of random mutations.


Yes, I did. As the paper I pointed out before suggested [emphasis added]:

" Functional domains of proteins include sites for phosphorylation, glycosylation, ubiquitinilation, sumoylation and glutathionylation, as well as sites that interact with other proteins. The function of such domains is determined by specific sequences of amino acids that must be coded in the DNA. All functional domains and sites will contribute to sequence identity of homologous genes in separate species. Phylogenetic analyses are therefore in effect a genetic mirage—the result of artificial constraints imposed by analyzing functional domains together with non-random mutations—largely determined by the Physico-chemical properties of the DNA sequence and its environment."

(PDF) Shared mutations: Common descent or common mechanism? (researchgate.net)

This goes back to the argument for the function found in the non-coding regions of DNA and the so-called junk DNA regions, which supports the model.

My point is that you are merely assuming those patterns represent common descent or inheritance.
But, those patterns actually represent a common blueprint and mechanism the designer used to front load the reproductive and survival capacity into basic types to adapt to their respective environments (i.e. homoplasy/HGT and non-random mutations/cytosine deamination).

These studies provide support for what I am saying here:

“Furthermore, we show that our proposed model accounts for most of the mutations at neutral sites but it is probably the predominant mechanism at positively selected sites. This suggests that evolution does not proceed by simple random processes but is guided by physical properties of the DNA itself and functional constraint of the proteins encoded by the DNA.”

Evolution: are the monkeys’ typewriters rigged? | Royal Society Open Science (royalsocietypublishing.org)

“As we have presented it here, the key distinction between the origin of life and other ‘emergent’ transitions is the onset of distributed information control, enabling context-dependent causation, where an abstract and non-physical systemic entity (algorithmic information) effectively becomes a causal agent capable of manipulating its material substrate.”

The algorithmic origins of life | Journal of The Royal Society Interface (royalsocietypublishing.org)

Thus, this is why the designer would create in such a way where separately created species fall into a nested hierarchy. It is all about filling the environments of the earth with animals.

Furthermore, finding function in pseudogenes and ERVs would confirm the common design model because they are separate predictions from common descent (since both make the same predictions regarding phylogenetic patterns).

If God has a human nature that is immutable, we expect God to be consistent with his nature, but also to operate as a human would.

For instance, according to the laws of logic, the attributes of God have to work in accordance with each other in a logically consistent manner because he is who he is (i.e. the law of identity) and cannot not be who he is at the same time (i.e. law of non-contradiction).

This means that God cannot make himself cease to exist because this would conflict with him being a necessary being. God cannot make a square circle because this would conflict with his omniscience. God cannot lie because it would conflict with his omnibenevolence. God cannot make a rock so heavy that he cannot lift it, because that would conflict with his omnipotence.

Most importantly, God cannot create and develop a world that does not have God intimately involved in the process every step of the way because it would conflict with his “Personal” nature.

Thus, God must be true to “all” his attributes, because to do otherwise would be to deny his own self.

Because the designer uses a common blueprint and mechanism to front load the reproductive and survival capacity into basic types to adapt to their respective environments (i.e. homoplasy/HGT and non-random mutations/cytosine deamination). These mechanisms naturally produce those patterns.

I guess because a common mechanism implies a common designer, but I think I am getting mixed up with common descent and natural selection, which are related but not the same.

They will look almost exactly the same depending on the context.

Here is how the article explains it:

"Horizontal gene transfer (HGT) enables organisms to acquire pre-existing adaptive characters from other organisms, regardless of phylogenetic distance. Thus, instead of genetic traits within lineages always emerging gradually through successive mutations and selection, evolution is accelerated as a parallel process, where inventions made in different lineages can come together in a single cell through HGT.

…In addition to sharing metabolic capabilities between unrelated organisms, HGT also plays an important role in creating new functional roles for existing proteins by assembling new metabolic pathways. Some pathways that changed the face of planet Earth, such as acetoclastic methanogenesis in Methanosarcina [2,3] were likely assembled through gene transfer. All enzymes involved in the newly identified methylaspartate cycle for acetyl-CoA assimilation in Halobacteriales were acquired through the horizontal transfer and recombination of different pre-existing genes from different bacterial genomes [4]. "

No, you are confusing theory with model. They are not the same thing. The theory predicts teleology because if life was designed by a common designer, then there is a reason behind its architecture and composition. This is my model I was referring to:

Around 3.8 billion years ago, some viruses contained all the required genes to make certain evolutionary trajectories more likely, thus forming the basic types.

Through natural selection, these viruses evolved into different unicellular species in the deep-sea hydrothermal vents of the primitive earth, undergoing an extensive amount of HGT.

Subsequently, the designer used HGT and cytosine deamination to develop basic types from different times and global locations. These basic types would branch into diverse progeny to deliberately pioneer environments worldwide over long epochs.

The difference is between randomness and non-randomness, and between function and non-function.

Here is Fuz Rana’s explanation:

"According to the evolutionary paradigm, ERVs become instantiated in the genome as a consequence of a retroviral infection of germ line cells (which develop into sperm and egg cells). Because its genetic material is integrated into the gametes’ DNA, the retrovirus is passed on to offspring, becoming a permanent feature of the host genome. If the ERVs experience inactivating mutations, they lose the capacity to spawn new retroviral particles and, consequently, become nonfunctional features of the genome. Recombination events can fragment the ERV sequences and give rise to sequence elements such as LTRs.

…When evolutionary biologists present this type of argument, they make two interrelated assumptions: (1) the ERVs (and derived sequence elements) lack function and (2) the origin stems from rare, random events. As already noted, the question is, why would a Creator introduce nonfunctional ERVs and LTR sequences into an organism’s genome? And if the insertion of retroviruses into the host genome is random, then the only reasonable explanation for shared ERVs at corresponding locations is a common ancestor.

However, if these two assumptions are invalid, then we could legitimately interpret the shared ERVs and LTRs as either common design features (if these sequence elements display function) or the result of nonrandom, repeatable events that took place independently in separate organisms."

Does Retroviral DNA Insert Randomly into Genomes? - Reasons to Believe

Falsifiable and testable predictions that his conclusion must yield.

The differences in eye design seem to be entirely based on how they capture and eat prey, according to this study on octopuses:

"The horizontal pupil is unusual in a predator but apparently there is a good rationale in this species. When dilated the pupil is large and round. When the octopus is diurnally active the pupil will constrict to the slit seen on the cover. This horizontal slit has been shown to limit the luminance without limiting the acuity as if pulling a blind in your house on a sunny day. The eyes are located on the sides of the head with a very limited, if any, binocular visual field. Most investigations of the octopus attack would suggest that these attacks are uniocularly guided. The visual system is coordinated by the connections through the ventral optic commissure, allowing the octopus to access memory units on the contralateral hemisphere as well as coordinate activity of all eight arms.

Hunting strategies are complex and include camouflage and ambush, stalking, chasing, and other more sophisticated techniques. But, mere capture of a well armoured crustacean, such as a crab, does not automatically kill it. The octopus accomplishes the kill by drilling a hole, with its tongue-like radula, in the crab’s carapace, and injecting its venomous saliva. Although poorly understood, the poisonous saliva is used to dispatch the prey and begin the digestion, as well. Some octopus species are known to inject their enzymatic poison through the eye since it is the weakest point in the carapace and allows for the most rapid entry into the body." [emphasis added]

A well armed predator - PMC (nih.gov)

So what? What is your point? Why is this relevant or how does this invalidate my overall point?

Why?

Which they were able to do in the study:

“With independent genomic mutation datasets, including from the largest Arabidopsis mutation accumulation experiment conducted to date, we demonstrate that epigenomic and physical features explain over 90% of variance in the genome-wide pattern of mutation bias surrounding genes. Observed mutation frequencies around genes in turn accurately predict patterns of genetic polymorphisms in natural Arabidopsis accessions (r = 0.96).”

You are assuming that neutral mutations don’t provide a benefit to the organism, but they do, according to the ENCODE results, as I showed. The frequency of beneficial mutations is also at play, as I showed.
In regard to detrimental mutations, the fidelity of the process of DNA replication is also not rare, as I showed with the studies.

For these reasons, it can’t be considered random under any part of this particular definition:

"(i) they are rare exceptions to the fidelity of the process of DNA replication and because (ii) there is no way of knowing which gene will mutate in a particular cell or in a particular individual.

However, the meaning of “random” that is most significant for understanding the evolutionary process is (iii) that mutations are unoriented with respect to adaptation; they occur independently of whether or not they are beneficial or harmful to the organisms. Some are beneficial, most are not, and only the beneficial ones become incorporated in the organisms" through natural selection.

This would produce exceptions from the expected pattern of a nested hierarchy. Are you now claiming that there shouldn’t be a nested hierarchy? You keep going back and forth between convergence and divergence being the overall trend. Are you swinging back to convergence and homoplasies being the rule?

Front loading, as you describe it, is common ancestry and inheritance. You are claiming that all life shares a common ancestor, and that the process by which they evolved are the very natural processes we see occurring at present. The only other claim you make is that the genome of this very distant common ancestor and those natural processes were front-loaded to produce the species we see today.

There aren’t separately created species in your model. They all share common ancestry.

We already observe massive swaths of the human genome that accumulate mutations at a rate consistent with neutral drift which is massive evidence that they are non-functional. You would need to explain why these sequences can be mutated nearly at will and not lose their function.
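For reference, the expectation being invoked is the standard neutral-theory result: in a diploid population of size N with per-site mutation rate μ, new mutations enter at rate 2Nμ per generation and each fixes with probability 1/(2N), so the long-run substitution rate is

$$k = 2N\mu \times \frac{1}{2N} = \mu$$

Population size cancels, so DNA under no functional constraint should accumulate substitutions at the raw mutation rate, which is exactly the behavior observed in those regions.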

Humans can design organisms so that they violate a nested hierarchy. What laws of logic are humans breaking?

Why would this produce a nested hierarchy?

All rivers flow downhill because of the same mechanism which is gravity. Does this mean the path of rivers are designed? Is there anything in nature you can point to that is not designed?

If this is front loaded, then why don’t the same genes move between the same species every time and produce identical genomes? How do we get variety between species?

How do you differentiate between a designer causing HGT and HGT occurring naturally?

If not all non-functional transcription binding sites are removed by natural selection then they will still be present in the genome and falsely assigned function by studies like the ENCODE project.

Neutral mutations by their very definition do not provide benefit to the organism. If they provided benefit then they would be beneficial mutations, not neutral mutations.

Fidelity is not “rare”? What in blazes does that mean?

The same process that produces beneficial mutations also produces neutral and detrimental mutations. The same processes produce mutations all over the genome, and no one can predict which base will be mutated during a single replication. That is random by every measure.

No, that fails to explain nested hierarchy. It might explain similarity, and it might (through different environmental requirements) explain some differences. But it can’t explain why those similarities and differences display nested hierarchy. You think you have an argument for that, but you don’t.

No, they don’t. You are misreading them. Mutation can be biased toward certain types of changes, e.g. transitions over transversions. Selective constraints can prevent fixation of mutations at certain sites or limit the sorts of mutations that can be fixed. None of that shows what you think.

Nor are we talking about the origin of life here.

Your conclusion doesn’t follow from your premise. Environments are not arranged in a nested hierarchy, so there’s no reason that animals created to fit environments should be either.

No, they don’t make the same predictions regarding phylogenetic patterns. Nor does common descent predict whether pseudogenes or ERVs should or should not be functional. What it does predict is that they should fit into a nested hierarchy, which they do. Pseudogene sequences show nested hierarchy along with their functional relatives. ERV insertions diagnose clades, and their sequences, post-insertion, fit a nested hierarchy. Nothing is what you imagine it is.

Meaningless collections of words do not make any real point.

Another conclusion that doesn’t follow from the premise. If God’s nature is human, then it’s entirely possible for him to fail to act in any particular case, since no human does everything. But of course your God is human when it suits you and inhuman when it suits you, with no consistency whatsoever.

But there is no reason to suppose that such a mechanism would produce those patterns, and even if it would, it wouldn’t operate between basic types. Another failure of your reasoning.

If we took this seriously, it would mean that all the basic types are 3.8 billion years old. Would you confirm that or rephrase the claim?

Whoops, that contradicts the previous statement. I would also point out that cytosine deamination occurs to existing genomes, and thus implies common descent. Are you now saying that new basic types descend from old basic types?

The first assumption is not necessary. The second is compatible with the data.

If they took place independently, why do they fit a common nested hierarchy?

None of that is relevant to the matter at hand: the inverted vs. non-inverted retina. You even bolded a bit that’s about crustacean eyes, suggesting that you don’t read what you quote, merely act like some kind of myopic search bot.

So you had God over for dinner and he told you all these things about himself? Or you read God’s mind?

Apart from this not having any relevance to retinal structure[1], did anyone else notice that the second sentence @Meerkat highlighted is not about the octopus eye at all, but about how the octopus attacks the eyes of its cancrine prey?

Added: @John_Harshman did too.


  1. That paper does go into retinal structure in some detail, and ascribes the difference between fish and octopus to evolution. @Meerkat didn’t quote that section though, despite being far more relevant than the section he did quote: “The octopus eye (an invertebrate eye) resembles a fish eye (a vertebrate eye) and is a good illustration of convergent evolution. On closer examination, however, there are substantial and critical differences illustrating that the eyes of octopuses and of fish do not have a common camera-style eye ancestor, but rather each evolved its individual eye separately. Both the octopus and the fish eye have a camera-style eye with an iris, nearly circular lens, vitreous cavity, and photoreceptor cells lining the interior of the cavity. There the similarity ends. The octopus has no cornea and has a very different retinal anatomy when compared to that same fish. The retina is everted, meaning that the photoreceptive element is directly behind the lens with no interposed ganglion cells or other tissue to interfere with the image resolution.”