Consensus should determine what's taught in science classes. Why?

No, I asked for an example of a real scientific paper from a young-earth creationist.

That comes closer, but if that’s the best you can do, it makes my point. That “paper” is a joke. It contains no actual research, is poorly referenced, poorly edited, and draws laughable conclusions from erroneous interpretation of others’ data. It’s apologetic, not science. Another example of cargo cult science.

Can we assume that just repeating your previous claims is not helping? Especially when I’ve already shown that it’s a nonsensical argument?

More word salad, just repeating previous word salad.

That’s an extremely lame excuse for incompetence. Are you not even the least bit ashamed?

That’s because you “address” objections by simply repeating what you said before. This is not a valid reply.

No. That should be good enough. The proper response to a complaint that your conclusions don’t follow from your premises is to elaborate your argument by explaining the chain of reasoning more completely.

And whole genomes display this convergence? Very unlikely.

Note that the convergence was recognized because other data showed the correct tree. Convergence is the exception that can only be recognized when the true nested hierarchy is clear. That argues against your claim, not for it.

So? Why does functional convergence show different basic types? (And I repeat that your claim of convergence is just wrong in this case; pseudogenes are inherited in the ordinary way and show the standard nested hierarchy; ERVs, once inserted, are inherited in the ordinary way and show the standard nested hierarchy.)

No, you have not clearly shown this and that’s the problem. All you did was sort of explain how the examples I used to describe how the ecology criterion works were not good enough.

But I added more information to the criterion in order to show that those examples do demonstrate how it works or can work.

Now, I am waiting for you to explain how it still does not demonstrate how it works or can work.

And you keep responding with assertions rather than well thought out constructive criticisms that would allow me to give you a well thought out response.

My point was that the common mechanisms (not descent) the designer used are what naturally produce the appearance of common ancestry. I am referring to mechanistic convergence here, which involves the multiple independent emergences of biochemical processes that use the same chemical mechanisms. HGTs and paralogous genes would be examples of this.

Because the common blueprint the designer uses also produces those patterns (i.e. functional requirements or a dependency graph). As the paper I pointed out before suggested:

" Functional domains of proteins include sites for phosphorylation, glycosylation, ubiquitinilation, sumoylation and glutathionylation, as well as sites that interact with other proteins. The function of such domains is determined by specific sequences of amino acids that must be coded in the DNA. All functional domains and sites will contribute to sequence identity of homologous genes in separate species. Phylogenetic analyses are therefore in effect a genetic mirage—the result of artificial constraints imposed by analyzing functional domains together with non-random mutations—largely determined by the Physico-chemical properties of the DNA sequence and its environment."

This goes back to the argument for the function that is found in the non-coding regions of DNA and the so-called junk DNA regions, which give support for the model.

My point is that you are merely assuming those patterns represent common descent or inheritance.

But, those patterns actually represent a common blueprint and mechanism the designer used to front load the reproductive and survival capacity into basic types to adapt to their respective environments.

These studies provide support for what I am saying here:

“Furthermore, we show that our proposed model accounts for most of the mutations at neutral sites but it is probably the predominant mechanism at positively selected sites. This suggests that evolution does not proceed by simple random processes but is guided by physical properties of the DNA itself and functional constraint of the proteins encoded by the DNA.”

Evolution: are the monkeys’ typewriters rigged? | Royal Society Open Science (royalsocietypublishing.org)

“As we have presented it here, the key distinction between the origin of life and other ‘emergent’ transitions is the onset of distributed information control, enabling context-dependent causation, where an abstract and non-physical systemic entity (algorithmic information) effectively becomes a causal agent capable of manipulating its material substrate.”

The algorithmic origins of life | Journal of The Royal Society Interface (royalsocietypublishing.org)

Again, finding function in pseudogenes and ERVs would confirm the common design model because they are predictions that separate it from common descent (since both models make the same predictions regarding phylogenetic patterns).

No, you just repeated what you had said before. No new information. In fact you have been repeating the same thing verbatim twice in the same post.

But why should it? Even if all differences among species were functional, why should functions be organized in a nested hierarchy? Why should different functional units be organized in the same nested hierarchy? There’s no reason for that. Contrast with common descent, which naturally and for obvious reasons expects a nested hierarchy.

Do you even know what “paralogous” means?

Why would this common blueprint produce any such things? Functional requirements do not produce nested hierarchy. Dependency graphs do not produce nested hierarchy, and there is no evidence that there is any sort of dependency pattern anyway.

When you say “the paper I pointed out before”, that’s of no use since you have pointed to many papers before.

You have not clearly articulated a model or explained what predictions this model would make or why it would make them. Unless function is organized in a nested hierarchy, there is no reason to expect functional sequences to be so organized either. And you have given no reason to expect function to be organized in a nested hierarchy. Nor have you provided any argument that all sequences are functional; there is good reason to suppose that most sequences in most genomes are not.

Yes, and that’s because common descent is the only thing we know of that produces a nested hierarchy. It also explains many other patterns, fossil, biogeographic, developmental, etc.

There is no evidence for that. There is no evidence for such a blueprint, and there is no evidence that basic types exist.

They don’t, even if their assertions are true. Even if you understood what they’re saying, which I suspect you do not. I also suspect that you have not actually read those papers but have just pulled the quotes from some creationist web site. Is that true?

You have not explained why function in (a few, not most) pseudogenes and ERVs would not be expected from common descent. You have not explained why it would be expected under your model, whatever it is. And you have not explained why both these elements would display nested hierarchy.

In that case, I am going to recount our previous posts and you can see whether there is anything else that needs to be said or clarified to make it into a workable method. It seems to me that I adequately responded to everything and thus there is no reason to move forward on this point.

Because they respond differently in different environments. For instance, although both vipers are known to live in rainforests or moist, cool areas, the Fea viper cannot tolerate dry environments, while pit vipers can live in and tolerate many different environments.

Because the hybridization experiments have presumably been successful in showing a relationship between species of Pit Vipers. As I told you before, this method would disprove the hypothesis that they are separate basic types.

Well, the focus here is just to show how the model can be tested, not to provide original research results.

How so? Explain how it makes it meaningless.

The main reason is based on the observations suggesting that the reproductive and adaptive capacity in basic types were front loaded.

I am answering this below regarding the front-loading hypothesis.

The short answer to the first question: functions need to be organized in a nested pattern in order to implement the "dual hardware/software nature of the DNA, where genes act both passively as physical structures to be copied, and are actively read-out as a source of algorithmic instructions."

The short answer to the second question: different functional units need to be organized in the same nested hierarchy in order to "supervise which of these two roles the blueprint must play at a given time, thereby ensuring that the blueprint is treated both as an algorithm to be read-out and as a structure to be copied, depending on the context."

In other words, “Phylogenetic analyses are… the result of artificial constraints imposed by analyzing functional domains together with non-random mutations—largely determined by the Physico-chemical properties of the DNA sequence and its environment.”

(PDF) Shared mutations: Common descent or common mechanism? (researchgate.net)

The long answer to both questions:

"The [Universal Constructor] forms the foundation of von Neumann’s theory on self-replicating automata. However, an UC is a mindless robot, and must be told very specifically exactly what to do in order build the correct object(s). It must therefore be programmed to construct specific things, and if it is to replicate then it must also be provided with a blueprint of itself. However, as von Neumann recognized, implicit in this seemingly innocuous statement is a deep conceptual difficulty concerning the well-known paradoxes of self-reference.

To avoid an infinite regress, in which the blueprint of a self-replicating UC contains the blueprint which contains the blueprint … ad infinitum, von Neumann proposed that in the biological case the blueprint must play a dual role: it should not only contain instructions such as an algorithm, to make a certain kind of machine (e.g. the UC) but should also be blindly copied as a mere physical structure, without reference to the instructions it contains, and thus reference itself only indirectly.

This dual hardware/software role mirrors precisely that played by DNA, where genes act both passively as physical structures to be copied, and are actively read-out as a source of algorithmic instructions. To implement this dualistic role, von Neumann appended a ‘supervisory unit’ to his automata whose task is to supervise which of these two roles the blueprint must play at a given time, thereby ensuring that the blueprint is treated both as an algorithm to be read-out and as a structure to be copied, depending on the context.

In this manner, the organization of a von Neumann automaton ensures that instructions remain logically differentiated from their physical representation. To be functional over successive generations, a complete self-replicating automaton must therefore consist of three components: an UC, a (instructional) blueprint and a supervisory unit." [emphasis added]

The algorithmic origins of life | Journal of The Royal Society Interface (royalsocietypublishing.org)

False. Von Neumann’s work was on self-reproducing automata; a cellular automaton is a discrete model of computation studied in automata theory.

FYI, “Automata theory is closely related to formal language theory. In this context, automata are used as finite representations of formal languages that may be infinite. Automata are often classified by the class of formal languages they can recognize, as in the Chomsky hierarchy, which describes a nesting relationship between major classes of automata.” [emphasis added] Here is an example:

Automata theory - Wikipedia

Well, I just showed above how and why function is organized in a nested hierarchy.

So I guess there is no reason to explain any further why we would expect those functional sequences to be organized that way.

I never argued that all of them are functional, just the vast majority, since degradation of an original design is a real thing. More importantly, I have made it very clear that I am not relying on one definition of function but on multiple well-established definitions to validate this argument.

You have provided no good reason for why we must use only your personally preferred definition of function.

According to this study, this cannot be the case:

" Both phylogenetic and mismatch-distribution analysis suggest that 9-bp deletion arose independently in sub-Saharan Africa and Asia and that the deletion has arisen more than once in Africa. Within Africa, the deletion was not found among Khoisan peoples and was rare to absent in western and southwestern African populations, but it did occur in Pygmy and Negroid populations from central Africa and in Malawi and southern African Bantu-speakers. The distribution of the 9-bp deletion in Africa suggests that the deletion could have arisen in central Africa and was then introduced to southern Africa via the recent “Bantu expansion.”"

mtDNA control-region sequence variation suggests multiple independent origins of an “Asian-specific” 9-bp deletion in sub-Saharan Africans. - PMC (nih.gov)

As you can see, the common genetic mechanism of non-random mutations can equally explain these findings and others because the exact same mutations can be found in genomes independent of common descent. Thus, natural selection and common descent are not required to explain the distribution of shared mutations.

This is not what the studies I provided suggest:

" In conclusion, it appears that the mutational processes which drive evolution are not random, but rather are the result of properties of the DNA itself in combination with functional constraint at the protein level… If our model and evolutionary predictions are correct, then the extent of the SRaF system at the mitochondrial locus could be used to identify those species that will best respond to rapidly changing environments and prioritize those which need the most attention with regards to their potential for bioenergetic adaptation."

Evolution: are the monkeys’ typewriters rigged? | Royal Society Open Science (royalsocietypublishing.org)

"Purely analogue life forms could have existed in the past but are not likely to survive over geological timescales without acquiring explicitly digitized informational protocols. Therefore, life forms that ‘go digital’ may be the only systems that survive in the long-run and are thus the only remaining product of the processes that led to life.

As such, the onset of Darwinian evolution in a chemical system was probably not the critical step in the emergence of life. As we have discussed, trivially self-replicating systems can accomplish this. Instead, the emergence of life was probably marked by a transition in information processing capabilities." [emphasis added]

The algorithmic origins of life | Journal of The Royal Society Interface (royalsocietypublishing.org)

As these studies suggest, the stepwise evolution of reproductive and survival capabilities did not arise through an unforeseen trial-and-error process of random mutations and natural selection.

Instead, the information that constructs the reproductive and adaptive capacity of the very first microorganisms was front-loaded. Then, God used them to construct the very first unrelated created kinds to potentially pioneer environments across the globe.

Nope, I read them. Now, it’s up to you to explain why they somehow do not support my thesis.

I think Edward from Talkorigins explains this well:

"The possibility of identical genetic accidents creating the same two pseudogene or Alu or endogenous retrovirus independently in two different species by chance is so unlikely that it can be dismissed. As in the copyright cases discussed earlier, such shared “errors” indicate that copying of some sort must have occurred.

Since there is no known mechanism by which sequences from modern apes could be copied into the same position of human DNA or vice versa, the existence of shared pseudogenes or retroposons leads to the logical conclusion that both the human and ape sequences were copied from ancestral sequences that must have arisen in a common ancestor of humans and apes."
Plagiarized Errors and Molecular Genetics (talkorigins.org)

Since [non-random mutations and HGT are known mechanisms] by which sequences from modern apes could be copied into the same position of human DNA or vice versa, the existence of shared pseudogenes or retroposons leads to the logical conclusion that both the human and ape sequences were copied from [a common blueprint] that must have [been designed by a common designer] of humans and apes.

Lastly, we would not expect function from pseudogenes in your model by the very definition of their being called “pseudo”. So I don’t know why you wanted me to explain it. The same goes for ERVs.

Please don’t. It’s exceedingly rude to respond to a request not to just repeat the same thing by just repeating, at length, the same thing.

You have made no logical connection between your premise and conclusion. I strongly doubt that it would be possible to make such a connection.

Again, no logical connection.

That’s not just other words; it’s a totally separate claim. And there is no evidence that this claim is correct. Borger is a crank.

Your long answers make no more sense than your short ones.

That’s not a response to my question. And a series of proper subsets does not make a nested hierarchy. I don’t think you understand nested hierarchies, much less biological nested hierarchies.

I’ll accept that you think you did. But it’s just a series of non sequiturs.

More useless verbiage.

You don’t understand that study.

Yes, homoplasy happens. But not enough to obscure the main signal in the data. You have, here, a single example of homoplasy in one indel. You can’t extend that to the entire genomes of every species. You found a three-legged dog and think you have proven that all dogs have three legs.

You have no idea what those studies suggest. And I strongly suspect you have not read any of them. Am I right?

I still doubt you read anything. But you would have to first explain why they somehow do support your thesis.

You also have no idea what Edward (whoever he is) was saying.

Your premise is wrong. Neither HGT nor non-random mutation could explain copying of those sequences into the same position in separate taxa.

No, what we would expect is that the great majority would not have functions, but like any other random mutations we would expect a few to be beneficial. And that’s what we see.

You misunderstood what I meant, which is not your fault. I will just ask this question then.
Is there anything else that needs to be said or clarified to make the ecology criterion into a workable method? Or is the main issue the justification or basis for the method, such as the claim that reproductive and survival capacity were front-loaded?

If it is the latter, then read on…

Here is the front-loading hypothesis I am referring to, in detail:

"the investment of specific types of genomic information programmed in the first life forms. This information would shape and constrain subsequent evolution through its dissipation "

Furthermore, the initial life forms were constructed such that the evolution of certain life forms would be more probable. It is designing future states in the present, making use of the evolutionary mechanisms.

These initial life forms would be microorganisms constructed by a universal common designer or constructor. These microbes have an optimal genetic code; a genome encoding specific pre-planned information; and the molecular machines necessary for life.

In their genomes, these microbes contain the genes that would be necessary for vertebrate-like creatures (i.e. basic types) to exist.

The genetic code in these microbes is optimal in that it minimizes the possibility or impact of deleterious mutations.

There is already support for this hypothesis found in this study:

Results

“… Ac manifests a complex signaling and cell communication repertoire, including a complete tyrosine kinase signaling toolkit and a comparable diversity of predicted extracellular receptors to that found in the facultatively multicellular dictyostelids. An important environmental host of a diverse range of bacteria and viruses, Ac utilizes a diverse repertoire of predicted pattern recognition receptors, many with predicted orthologous functions in the innate immune systems of higher organisms.

Conclusions

Our analysis highlights the important role of LGT in the biology of Ac and in the diversification of microbial eukaryotes. The early evolution of a key signaling facility implicated in the evolution of metazoan multicellularity strongly argues for its emergence early in the Unikont lineage. [emphasis added]

Genome of Acanthamoeba castellanii highlights extensive lateral gene transfer and early evolution of tyrosine kinase signaling | Genome Biology | Full Text (biomedcentral.com)

If the front-loading hypothesis is true, then we should find non-random mutations and homoplasy in similar basic types.

Here are some studies besides the ones I gave you that support non-random mutations:

Evidence of non-random mutation rates suggests an evolutionary risk management strategy | Nature

Mutation bias reflects natural selection in Arabidopsis thaliana | Nature

I am still waiting for you to explain why you say this.

And it does not look like you understand automata theory or have read the sources on it. Again, cellular automata operate the same way as biological processes, which would include nested patterns.

So it does not make any sense to say they don’t make them, especially when the sources specifically said, “which describes a nesting relationship between major classes of automata.”

Ok John, if this is really the case what do you make of this from the study I gave you before:

“This suggests that evolution does not proceed by simple random processes but is guided by physical properties of the DNA itself and functional constraint of the proteins encoded by the DNA.”

I was not referring to homoplasy here but to non-random mutations, which is another mechanism that can equally explain those findings and others.

That being said, yes. We can extend homoplasy that far based on the study I gave you on LGT.

It is possible that I misunderstood it, but I definitely read it John. The real question is, did you read them? I am not so sure about that.

My thesis on nested patterns is that they are due to a common blueprint and mechanism the designer used to front load the reproductive and survival capacity into basic types to adapt to their respective environments.

If this is true, then we have good reason to expect the 5 questions to reliably delimit basic types. I think this article explains why the study I provided suggests this:

"In the new study, the researchers looked at all of the DNA sequences under positive selection (or those that help an organism adapt to its environment), to see whether they were near a repeated sequence.

… The findings could explain why evolution occurs much faster than if mutations were, in fact, totally random, the researchers said. The repeated sequences may also be necessary for evolution, they said.

For example, genetic diversity at these DNA sites could help species adapt to changes in the availability of food and other resources that can result from climate change, Garvin said. So these repeat sequences could be used as a predictor for how a population will respond to environmental changes." [emphasis added]

So the study I provided on HGT does not show this?

No, it is not. You can go read this secondary source to get all the sources containing the studies that show functional ERVs and pseudogenes, as well as commentary on them. The evidence is now overwhelming:

Endogenous Retroviruses (ERVs) Protect Early-Stage Human Embryos - Reasons to Believe

Why would ecology preclude the creation of a species with a mixture of bird and mammal features?

Automata do not fall into a nested hierarchy.

In other words, you have argued up and down that ID/creationism will produce a nested hierarchy, and yet you turn around and now argue that ID/creationism will produce numerous and obvious violations of that nested hierarchy. That is what convergence at the molecular level is, a violation of a nested hierarchy.

You need to make up your mind.

How many functional ERV’s and pseudogenes are there? All of us agree that there were always going to be some functional ERV’s because these DNA sequences start out as functional when they are inserted into the genome. That doesn’t change the fact that we expect the vast majority of ERV’s to be non-functional, and simply citing the existence of functional ERV’s and pseudogenes does not change this conclusion. It’s the numbers that matter.

Yes. But perhaps a few coherent examples would suffice. You have offered one incoherent example, a couple of groups of vipers.

That’s certainly one issue, since unjustified claims are not justification for the method.

What does that even mean?

That study does not support the hypothesis. And the hypothesis itself is incoherent. Nothing you use in support of your claims actually support those claims.

Because “a particular environment or several ones” covers all the possibilities, so cannot distinguish a group from a non-group, as any random assemblage of species will be either in one environment or in several environments.

You clearly don’t understand what a nested hierarchy is. Just using the word “nesting” doesn’t make a publication relevant. Words have meanings, but more importantly, sentences and paragraphs have meanings that aren’t communicated by the individual words taken in isolation. You appear not to have understood any of the random things you quote.

I read enough to know what they were actually about, which isn’t what you imagined.

No, even if that were true we would not have good reason to expect the 5 questions to delimit basic types, and that article explains nothing that you imagine it does. This is all about you reading a few words you like and making up fantasies in your head about what they mean.

It doesn’t. Again, nothing you post says what you think it does, except those from creationist sources, and they don’t understand the evidence they’re trying to present.

Sure I can. But what percentage of human ERVs and pseudogenes have been shown to have a function? Can you say?

I can’t wait to see how you guys weasel your way out of this one:

Tree (automata theory) - Wikipedia

Yes, let me clarify this point so we can get on the same page. As Hugh Ross has explained:

"Convergence refers to the occurrence of identical, or nearly identical, anatomical, physiological, and/or genetic features in species of life that are unrelated or distantly related within an evolutionary paradigm. Both theists and nontheists offer explanations for convergence, but those explanations are radically different.

Theists see convergence resulting from supernatural, super-intelligent interventions by a single Creator who employs a single, optimal solution to address a common set of problems faced by organisms possessing different characteristics and living in different habitats.

Nontheists conjecture that convergence occurs when unrelated species encounter identical, or nearly identical, environmental, predatory, and/or competitive selection effects. In other words, nontheists suggest that natural selection channels randomly occurring variations in unrelated species toward identical outcomes."

So we would expect to find convergent evolution in genes and morphology for both models. This includes nested patterns AND their so-called violations, as both of you even alluded to. However, the real difference is the frequency with which they happen.

The common descent model claims that it is rare because of the constraints of natural selection acting on random mutations. On the other hand, common design claims that it is ubiquitous because we are dealing with non-random mutations.

It seems that only time will tell from future discoveries. But then again, the ecology criterion might be able to speed up the process and allow us to find out sooner which model is more useful.

Well, it depends. What percentage or number would be enough to be considered validation for the common design hypothesis rather than common descent?

Without providing a number on your end, I don’t see the point of researching the number of cases on my end.

Let me bring in Mike Gene, an ID proponent who wrote the Design Matrix, to explain it a different way:

"One of the criticisms of the front-loading hypothesis is that you can’t design a genome in a unicellular organism to evolve specific organs, tissues, biochemical systems, etc., several billion years in the future. But convergent evolution neatly answers this criticism.

A classic example of convergent evolution is the eye in the octopus and the mammalian eye.
The human eye and octopus eye both have the following tissues:

  1. Eyelids.

  2. Cornea.

  3. Pupil.

  4. Iris.

  5. Ciliary muscle.

  6. Lens.

  7. Retina.

  8. Optic nerve.

Furthermore, the arrangement of these parts are practically the same. And these two systems have arisen independently, through convergent evolution (i.e., they are not related through common descent; see Ogura et al., 2004). This means that these two organs have evolved as a result of the initial state of the last common ancestor of mammals and octopuses.

In short, the convergent evolution of these two organs demonstrates that a genome can be programmed to evolve a given objective. If we ran the “clock of life” backwards (to borrow from Stephen J. Gould), human-like eyes would probably appear on the scene once again. In other words, the same system keeps popping up again and again.

And this is evidence that a given objective can be front-loaded, starting with a specified initial state. The eye is a beautiful example of convergent evolution, wherein 8 separate “parts” independently came together in the same arrangement to produce the function of vision.

Are there examples of convergent evolution in biochemical systems? If so, this would provide evidence that not only can organs be front-loaded, but so too can biochemical systems."

Alright, let’s go back to my thesis… nested patterns are due to a common blueprint and mechanisms (i.e. homoplasy and non-random mutations) the designer used to front load the reproductive and survival capacity into basic types to adapt to their respective environments.

This means that the genes necessary for their origin did not have to gradually evolve, because the genes necessary for their origin were in the original life forms.

This also means that different ecologies or environments should delineate separate basic types, since they are supposed to be preprogrammed to survive and reproduce in a particular environment.

If the hypothesis is incoherent and you don’t understand it, then how do you know the studies don’t support it?

Right, this is why I added more clarity to the criterion in the event that a set of basic types are in the same habitat. Again, if the answer is ‘No’ or ‘TBD’ to the question “Is there a substantial difference in habitat?”, then we ask a follow-up question: “Do they respond differently in different habitats?” (This may require artificially planting them in different habitats for an answer.)

If the answer is ‘yes’ to either question, we can automatically conclude that God constructed each basic type separately. If not, we rely on other measures to make a definitive conclusion.

Vehicles don’t fall into a nested hierarchy. It is an observable fact. Why do you keep weaseling your way around this fact?

With evolution we only expect superficial similarities, and that is exactly what we see. For example, birds and bats both have wings, but the underlying skeletal structure of the wings is very different. Sharks and dolphins have superficially similar pectoral fins, but the underlying structure of the dolphin fin is much more like the human arm than the shark fin.

This shouldn’t be the case with convergence if it is supernaturally created. There is no reason to create only superficial resemblances when it is possible to make the entire feature identical. There is no reason to only change a little sequence here and there so tiny bits are similar. Instead, it is entirely possible to make whole genes the same sequence across distantly related species.

So the expectations are not the same, not by a million miles.

Those eyes are different from one another. For example, the octopus eye has a forward facing retina while the human eye has an inverted retina. Why would this be the case with front loading? We would expect different solutions from evolution, but why so with front loading?

How? How does the front loading work at the molecular level? How are mutations non-random, and where are the observations of this non-random process?

Vertebrate fish occupy the same environment as the octopus, yet they have very different eyes.

No weaseling is necessary or appropriate. That article does nothing to support any claim you have made. It just uses a few words you like, and often in the same paragraphs, even sentences. When will you learn how to read and reason?

Different predictions would result from these premises. Evolution would predict that the convergent features would commonly differ in detail and would in fact show relationships to features in related taxa; creation would predict that “convergent” features would be absolutely identical and would appear in the species as if from nowhere, with no relationships to features in related taxa. Of course we see the former, not the latter, in almost every case.

No, we have enough data already. Separate creation is not tenable.

Considered in isolation, and without regard for the overwhelming evidence that comes from other data, I would say that we would have to discover function for most of them. In theory, they would all be functional, but perhaps we would not be able to find out what some of them did. Obviously that would be too many to examine, but random sampling should be sufficient. Then again, the comparative data tell us this is a stupid thing to look for, because functional sequences should be conserved among species, and neither ERVs nor pseudogenes generally show any such conservation.

I have never heard anyone say that. It’s a strawman. What they actually say is that front-loading is a problem because useless genes won’t wait around for billions of years to become functional in the future. They would tend to be lost and/or evolve beyond recognition. No, convergent evolution doesn’t answer anything, not even the strawman criticism.

This is just a made-up list of parts that your eye has. Octopus eyes don’t even have lids. Do you think at all before you post things?

Only three times, actually. And it’s not the same. The differences in detail are huge, as expected from evolution. Under creation, we would expect them to be identical.

The evidence is against that claim. Different species have somewhat different genes. New ones appear and old ones are lost throughout the history of life. And both appearance and loss follow the same nested hierarchy as do other data. Common descent wins again.

Nothing can support an incoherent hypothesis. Go ahead, prove that Fermat’s last theorem doesn’t support bingo for lasagna purple.

Sorry, makes no sense. Remember that species of viper, which you call a single basic type, are in different habitats, so your expressed criterion would say that they’re all different basic types. Your claims are self-contradictory. But what does “respond differently in different habitats” mean?

It is far from clear what point you are trying to make with that reference.

Yes, a tree is a useful data structure. And an automaton can be used to traverse a tree. But this has very little to do with the fact that biological species naturally fall into a nested hierarchy.

No, they don’t. They assert, but they do not see. They have no actual evidence of supernatural super-intelligent interventions.

That’s a data structure used by some automata. It’s not a way to classify automata. No weaselling required.

You clearly didn’t understand your source - if you even read it.

According to von Neumann’s theory of self-replicating automata, a complete self-replicating automaton must consist of three components: a UC, an (instructional) blueprint, and a supervisory unit. These functional components are required to produce successive generations of artificial life, which happens to produce patterns that look like a nested hierarchy.

This has everything to do with biological life because all known life contains these three components. More importantly, von Neumann’s universal constructor model is an exact representation of my model. For example, the universal common designer, the archetypical blueprint, and homoplasy/non-random mutations would be the biological versions of von Neumann’s universal constructor, blueprint, and supervisory unit.

However, they are claiming that automata theory does not produce or involve nested hierarchies. Instead, they are suggesting that this is all in my imagination, even though the sources I provided to them clearly refer to algorithmic information that produces nested patterns.

Do you agree with them? Does automata theory have nothing to do with nested patterns, nor produce them? Or am I somehow reading the sources wrong? If so, what am I missing?

I am going to need all of you to answer these questions as well to make sure everybody is on the same page… @RonSewell @Rumraket @Mercer

Again, @John_Harshman and @T_aquaticus are implying that I am misinterpreting these sources (and others) as saying that functional components in automata theory produce nested patterns by default, like the ones biological life displays:

Nesting (computing) - Wikipedia

Tree (automata theory) - Wikipedia

What does this have to do with what I referenced before?

Well, I never said the expectations were the same. Remember, there are two hypotheses at play: the front-loading hypothesis and the common design hypothesis.

The common design hypothesis suggests that the differences between a particular set of basic types that are similar in morphology and/or molecules are due to the different design requirements that each of them needs for its environment.

This means that we would expect to find function much more frequently, and to find similar basic types operating in different environments.

For example, the placement of the optic nerve in the human eye has been argued to be flawed when compared to that in the octopus eye because it results in a minor blind spot in our visual field, which does not occur in the octopus eye. However, the different placement of the optic nerve in humans versus cephalopods is actually due to the need for a greater blood supply to sustain high-acuity vision in warm-blooded animals.

According to Mike Gene, the DNA of these initial life forms is made up of the bases adenine, guanine, thymine and cytosine. Cytosine is used as a base in its DNA because it is prone to deamination, which will lead to specific mutations. Thus, mutations can be channeled in a particular direction by using cytosine deamination.

Here are the observations of non-random mutations:

Evolution: are the monkeys’ typewriters rigged? | Royal Society Open Science (royalsocietypublishing.org)

Evidence of non-random mutation rates suggests an evolutionary risk management strategy | Nature

Mutation bias reflects natural selection in Arabidopsis thaliana | Nature

What I meant was… according to the common descent model, the environment is supposed to be responsible for crafting the similar traits. Similar niche – similar trait. Similar environment – similar trait.

Under common design, the designer crafted those organisms to fit those environments from the very beginning. As Mike Gene put it:

“If both metazoans and Monosiga have independently come up with similar or identical mechanisms and architectures in the tyrosine kinase circuits, it would seem we might trace this to an intrinsic , rather than environmental, cause. That is, there was some kind of inherent molecular inertia built into the basic design plan of the TK circuit to cause it to evolve similarly in different creatures experiencing different niches.”

Yes, we actually have many cases like that, such as the centralized nervous system. In fact, it is not just convergence within an order, class, or phylum; it is observed in at least eight different phyla.

More than one way to a central nervous system (nature.com)

It’s also widespread within biochemical systems as well:

Convergent evolution: the need to be explicit: Trends in Biochemical Sciences (cell.com)

So over 51%?

No, not all, because of the degradation of the original design that we would expect from the second law.

Keep in mind, not every part of evolution is programmed and determined according to the front loading hypothesis. It simply means that the initial life forms were constructed in a way that the evolution of certain life forms would be more probable. More importantly, only specific types of genomic information that could shape future evolution would be front-loaded, such as reproductive and survival capabilities.

No, the front loading hypothesis does not involve turning genes on and off. What makes you say that?

No, it would not because hybridization tests have been successful within that species. Again, this method disproves the idea that a set of basic types are separate.

Sorry, I meant to say “Do they respond differently in the same habitat?” if the first question is No or TBD.

No, we’re saying outright that you are misinterpreting them. You misunderstand pretty much every source you cite or quote, with the exception (usually) of creationist web sites, and there the creationists do the misinterpreting for you.

That makes no sense at all, since most vertebrates, the ones with that eye design, are cold-blooded. Please try to think before typing.

Very silly. That’s too blunt an instrument to lead to any sort of adaptive direction. It doesn’t respond to environment and it produces only a particular type of mutation throughout the genome, not focused on any particular spot. It’s the most common mutation type, but does it achieve any particular result?

Then why do they look as if they change environments during evolution? Not making a lot of sense here.

A central nervous system would seem to be primitive within Bilateria, along with other characters like a pass-through gut, possibly some kind of visual receptor, etc. Not convergent. More references that don’t mean what you think they do.

Sounds like poor design, then. It appears that God isn’t perfect after all.

That makes the theory so vague as to be able to accommodate anything at all. Anything without reproduction and survival capabilities would neither reproduce nor survive, so that’s a no-brainer. But the nested hierarchy covers much more than such vague pronouncements.

It seemed a reasonable inference from your poorly stated, vague claims. It’s certainly the case for the much clearer claims that other people have made and that got the response I made. Because your description of front-loading carries no meaning, I am forced to imagine what you meant. Don’t blame me for that.

Whatever are you trying to say there? Hybridization within a species? That’s not hybridization, just ordinary reproduction. What species? Vipers are a subfamily with many species. I’m not aware of any hybridization experiments with vipers, and neither are you. Gibberish.

The question remains. What does “respond differently in the same habitat” mean? And why would that be relevant to diagnosis of basic types?

The key phrase here is “a complete self-replicating automaton”. That’s common ancestry. It is the process of replication that produces the nested hierarchy.

If the automatons were separately created then there is no reason why we should see a nested hierarchy.

The sources are saying that the nested hierarchy is produced by replication and common ancestry.

Nothing on that page describes a nested hierarchy of features or sequence. For example, I can embed the same “for loop” in many different places within a larger nested subroutine. This would violate a nested hierarchy. I do it all of the time when I am writing Python scripts. This also doesn’t relate to physical structures.

Vehicles do not fall into a nested hierarchy. They do not fall into a tree. There is no reason why a creator would need to use a tree to design organisms.

Everything.

Vehicles do not fall into a nested hierarchy. Vehicles are separately created, and there is absolutely no reason why separate creations would need to fall into a nested hierarchy.

Until you supply a process whereby front loading can even work and observations of non-random mutations that can support it, it simply doesn’t work.

So why would this exclude a species with a mixture of bird and mammal features?

Then why do cold-blooded fish living in the same environment as octopuses have an inverted retina?

But that doesn’t channel evolution in any direction with respect to fitness or morphology. A CpG mutation is just as likely to be detrimental or neutral as it is beneficial. It is random.

But they aren’t similar traits once you look at the specifics. For example, the bird and bat wings are entirely different.

If we compared the sequences for those genes what do you think we would see? Would we see the exact pattern of sequence divergence that we would expect from evolution, or would we see sequence convergence which you claim is the product of design?

Well, I never said this would exclude it in the first place.

I don’t know, but I don’t know what you are getting at here.

According to Mike Gene and the study he referenced, it does lead to benefits:

"…spontaneous deamination of cytosine can lead to a base substitution known as a transition, where C is replaced by T (and G is replaced by A on the other strand of DNA). We might expect such mutations to be quite common, as the rate constant for cytosine deamination at 37 degree C in single stranded DNA translates into a half-life for any specific cytosine of about 200 years. In fact, such high rates of deamination led researchers Poole et. al to complain of “confounded cytosine!”

Confounded cytosine! Tinkering and the evolution of DNA | Nature Reviews Molecular Cell Biology

There is a paywall but you can read a snippet of the article here:

Antievolution.org - Antievolution.org Discussion Board -Topic::ID,antievolution?

We would see sequence convergence, of course.

Mike Gene explains his hypothesis in detail:

"[T]he original cells were designed and such design entailed a degree of front-loading, in which certain evolutionary trajectories were made more likely. The initial designed state would translate as a front-loaded state given that evolution borrows rather than invents de novo. That is, evolution could “unpack” buried designs. The bias toward C-T transitions could thus be used to exploit this front-loaded state.

For example, protein X could be designed to fulfill function A. But buried in the sequence of X is the potential for function B (that is, function B is nearby in sequence space relative to sequence X/function A). A gene duplication, followed by exposure to the C-T mutational “stream,” could unlock function B, if the original sequence X was specifically designed to be unlocked by such mutations.

It is difficult to test such a hypothesis, as the unfolding of this front-loaded state may have been limited such that it did not reach very far from the original state. Nevertheless, the fact that the genetic code has been essentially frozen since its design may provide generic clues to the type of evolution that was/is built into life. Put simply, given that the genetic code itself was designed to minimize deleterious mutations, perhaps it was likewise designed to exploit the evolutionary potential of C-T transitions."

For more information, read this: Cytosine Deamination and Evolution | (wordpress.com)

Apparently, you are alone there, John, because @T_aquaticus specifically said that:

"The key phrase here is “a complete self-replicating automaton”. That’s common ancestry. It is the process of replication that produces the nested hierarchy.

If the automatons were separately created, then there is no reason why we should see a nested hierarchy."

What @T_aquaticus does not realize is that he is making my point because common design does not claim that nested hierarchies represent patterns of relatedness like common descent. Instead, it claims that these nested hierarchies represent a dependency graph based on 3 functional components: Universal common designer, blueprint, and homoplasy/non-random mutations.

Here is Mike Gene’s explanation for this:

"From a design perspective, this model of evolution need not exert its effects in a ubiquitous fashion. Instead, perhaps only key events in life’s evolution were significantly helped by this “directed evolution.” However, there is an unfortunate caveat worth mentioning. The ability for C-T transition to unmask front-loaded states may have long ceased to exist.

Such a dynamic may have been crucial to some early events in evolution, yet given these states may have dissipated (the proximal objectives were reached), current mutations may no longer reflect any detectable design bias. In such a case, the current predominance of C-T transitions (involved in some disease states) may simply be a vestige of design.

Although the above model is mostly speculative, and has yet to include other forms of mutation, it can serve as one platform for further design investigations." [emphasis added]

Well, you are entitled to your opinion, but the study has made it very clear that it is convergence we are dealing with here. Besides, this is just the most profound example of convergence to date because these features are found in so many other phyla. But there are other examples like it that span many different clades, such as the appendix:

Morphological evolution of the mammalian cecum and cecal appendix - ScienceDirect
Evolution Of The Human Appendix: A Biological ‘Remnant’ No More – ScienceDaily

No, it would just mean his proximal design objectives were reached.

Well, there is actually some truth to what you said here but it applies more so to the common design hypothesis and convergence. For instance…

“According to a new study that looked at 10 species of vertebrates, evolution used a kind of universal formula for turning non-monogamous species into monogamous species – turning up the activity of some genes and turning down others in the brain.”

Evolution used same genetic formula to turn animals monogamous – ScienceDaily

Conserved transcriptomic profiles underpin monogamy across vertebrates | PNAS

So after the front-loading process was done, certain genes were turned on and off depending on the context.

I am saying that the separate species of pit vipers that live in different habitats would not be considered separate basic types because of what you just said. We know already that they can reproduce with each other, which means they are related by a common ancestor of pit vipers.

What we don’t know is whether the pit viper species can hybridize with the Fea viper species. If the hybridization attempt is successful, then we know they came from the same created kind. If the test was unsuccessful, we use the ecology criteria to discover whether they are separate basic types. This leads me to address…

On second thought, I think both questions can potentially be useful. So let me address the last question first.

Do they respond differently in different habitats?

When you place Fea vipers in any environment that is not moist and cool, they cannot tolerate it. On the other hand, we know that pit vipers can live in and tolerate other environments besides moist and cool areas.

Do they respond differently in the same habitat?

I was referring to how both vipers shelter in that habitat when they want to avoid predators, lay eggs and hibernate.

Then your model doesn’t predict a nested hierarchy.

You claimed that vertebrates were designed with an inverted retina because such an arrangement was necessary to supply more blood in warm-blooded animals. Fish aren’t warm-blooded, and yet they have an inverted retina. How do you explain this?

What benefits? How do CpG mutations direct evolution in a certain direction?

But we don’t. We see divergence as shown by the nested hierarchy.

Great, so you should be able to answer the questions in your own words.

What is the process of front loading, how does it work? What are the physical mechanisms?

Where are the observations of non-random mutations?

How does this process guide evolution along specific paths and not others?

You don’t understand what people tell you here either.

Not an explanation. You misunderstand even “Mike Gene”, and in a different way from his misunderstanding of biology.

But why would he allow degradation of his objectives? That’s a design flaw.

Where is your evidence that separate species of pit vipers can reproduce with each other? Where is your evidence that different basic types can’t hybridize?

Wrong. We know that some species of pit vipers tolerate other environments, and other species can’t.

What do you know about these differences, and how would you know if they occur in different basic types or just one?

Gibberish of gibberish, all is gibberish.

No, you are just imposing your model of nested patterns onto mine. Again, common design does not claim that nested hierarchies represent patterns of relatedness like common descent. Instead, it claims that these nested hierarchies represent a dependency graph based on 3 functional components: Universal common designer, blueprint, and homoplasy/non-random mutations.

These functional elements predict and produce nested patterns as you even acknowledged. You just assume or explain it with common descent.

The differences between fish and human eye are due to the different design requirements each will need for their environment.

Fish operate in marine environments and humans in terrestrial ones.

Remember, the initial designed state of unicellular organisms contained all the genes needed to direct evolution according to observations. Given the genetic code itself was designed to minimize deleterious mutations, it seems likely that it was also designed to exploit the evolutionary potential of C-T transitions.

So the bias toward C-T transitions would just be used to exploit this front-loaded state and “unpack” buried designs. As a result, certain evolutionary trajectories were made more likely for the formation of basic types. For example, “a gene duplication, followed by exposure to the C-T mutational “stream,” could unlock function B, if the original sequence X was specifically designed to be unlocked by such mutations,” as Mike Gene suggested.

In contrast, other forms of non-random mutations were likely at play with the evolution of species within a basic type. This includes other mechanisms, such as natural selection that played a pivotal role in accessing that buried design. I gave you the studies already that reveal those other non-random mutations.

This is not true. We do see examples of sequence convergence (as well as many other types of convergence). Here is one example of it:

Convergent evolution of major histocompatibility complex molecules in humans and New World monkeys | Request PDF (researchgate.net)

Let’s consider the mutagenic effects of deamination among the four nucleotides used by DNA. As Mike Gene pointed out:

"A single-stranded DNA molecule with 2 million bases will experience a single deamination event involving cytosine every 2.8 hours (at pH 7.4 and 37 degree C). In contrast, it would take 140 hours for an adenine to experience deamination.

Given that guanine deaminates at rates similar to adenine, and thymine lacks an exocyclic amine, and thus experiences no deamination, we can see that the simple process of deamination would strongly favor cytosine as a target."

This type of physical bias of C-T transitions has been shown to be exerted in living things according to a number of studies:

Spectrum of spontaneous mutation at the APRT locus of Chinese hamster ovary cells: an analysis at the DNA sequence level. | PNAS

Specificity of spontaneous mutation in the lacI gene cloned into bacteriophage M13 - ScienceDirect

No, I just should have used my own words to address your points. Let me do that now…

Remember, the initial designed state of unicellular organisms contained all the genes needed to direct evolution according to observations. Given the genetic code itself was designed to minimize deleterious mutations, it seems likely that it was also designed to exploit the evolutionary potential of C-T transitions.

So the bias toward C-T transitions would just be used to exploit this front-loaded state and “unpack” buried designs. As a result, certain evolutionary trajectories were made more likely for the formation of basic types. For example, “a gene duplication, followed by exposure to the C-T mutational “stream,” could unlock function B, if the original sequence X was specifically designed to be unlocked by such mutations,” as Mike Gene suggested.

In contrast, other forms of non-random mutations were likely at play with the evolution of species within a basic type. This includes other mechanisms, such as natural selection that played a pivotal role in accessing that buried design. I gave you the studies already that reveal those other mutations.

Well, no. He does not allow degradation of his objectives. God maintains the original designs by editing and limiting the harmful genetic changes (Martincorena et al., 2012; Martincorena & Luscombe, 2013; Garvin & Gharrett, 2014). More importantly, researchers have suggested that there is a design trade-off at play:

“Mutagenesis appears to be the inevitable outcome of cellular wear and tear and is not necessarily cancer associated. Perhaps mutagenesis is not due to failings of DNA quality control mechanisms. Rather, such pathways may be naturally limited in activity, resulting in permissiveness to mutagenesis. We suggest that this is a prioritization of survival over genomic perfection, given that most DNA damage is inconsequential and thus, affordable.”

Cellular survival over genomic perfection | Science

Nonetheless, God seems to regulate the harmful mutations that do arise in a way that preserves a balance between predator and prey populations because too many predators or prey can cause a collapse of the ecosystem (Moore et al., 2010; Smee, 2012; Gilljam, 2015). [just ask for reference]

For instance, mitochondrial and chloroplast DNA are also abundantly involved in apoptosis, which is the single most important feature of multicellularity because it ensures timely death of individual cells. Cancer may be the ecological equivalent of apoptosis, ensuring the timely death of individuals so that resources are available for the young.

I am not an expert in ecology, so I can’t give you the details at the moment.